ChestXNet: Pneumonia Detection on Chest X-rays with Convolutional Neural Networks

In [1]:
import numpy as np 
import pandas as pd 
import matplotlib.pyplot as plt
import os

from glob import glob
from itertools import chain
from random import sample
from sklearn.model_selection import GroupShuffleSplit
from sklearn.metrics import roc_curve, auc, precision_recall_curve, f1_score, confusion_matrix, average_precision_score, accuracy_score

from keras.preprocessing.image import ImageDataGenerator
from keras.layers import GlobalAveragePooling2D, Dense, Dropout, Flatten, Conv2D, MaxPooling2D
from keras.models import Sequential, Model
from keras.callbacks import ModelCheckpoint, EarlyStopping
from keras.optimizers import Adam
from keras.applications.densenet import DenseNet121

%matplotlib inline
plt.rcParams.update({'font.size': 16})
Using TensorFlow backend.

Processing of the metadata:

In [3]:
# Loading the NIH data into dataframe together with full image filepaths for easier manipulation

all_xray_df = pd.read_csv('/data/Data_Entry_2017.csv')
all_image_paths = {os.path.basename(x): x for x in 
                   glob(os.path.join('/data','images*', '*', '*.png'))}
print('Scans found:', len(all_image_paths), ', Total Headers', all_xray_df.shape[0])
all_xray_df['path'] = all_xray_df['Image Index'].map(all_image_paths.get)
all_xray_df.sample(3)
Scans found: 112120 , Total Headers 112120
Out[3]:
Image Index Finding Labels Follow-up # Patient ID Patient Age Patient Gender View Position OriginalImage[Width Height] OriginalImagePixelSpacing[x y] Unnamed: 11 path
100346 00026593_000.png No Finding 0 26593 34 M PA 2992 2991 0.143 0.143 NaN /data/images_011/images/00026593_000.png
75933 00018625_000.png No Finding 0 18625 45 F PA 2782 2901 0.143 0.143 NaN /data/images_009/images/00018625_000.png
75064 00018407_000.png No Finding 0 18407 62 F AP 2500 2048 0.168 0.168 NaN /data/images_009/images/00018407_000.png
In [4]:
# Splitting the "Finding Labels" columns to have a binary column for each finding
finding_labels = np.unique(list(chain(*all_xray_df['Finding Labels'].apply(lambda x: x.split('|')).tolist())))

for label in finding_labels:
    all_xray_df[label] = all_xray_df['Finding Labels'].apply(lambda x: 1.0 if label in x else 0)

# Creating  a new column called 'pneumonia_class' for binary classification
all_xray_df['pneumonia_class']=all_xray_df['Pneumonia'].replace({0.0:'Negative',1.0:'Positive'})

# Moving the index to a separate column 'Scan_ID' for later manipulation during train/validation/test splits
all_xray_df['Scan_ID'] = all_xray_df.index
all_xray_df.head()
Out[4]:
Image Index Finding Labels Follow-up # Patient ID Patient Age Patient Gender View Position OriginalImage[Width Height] OriginalImagePixelSpacing[x ... Hernia Infiltration Mass No Finding Nodule Pleural_Thickening Pneumonia Pneumothorax pneumonia_class Scan_ID
0 00000001_000.png Cardiomegaly 0 1 58 M PA 2682 2749 0.143 ... 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 Negative 0
1 00000001_001.png Cardiomegaly|Emphysema 1 1 58 M PA 2894 2729 0.143 ... 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 Negative 1
2 00000001_002.png Cardiomegaly|Effusion 2 1 58 M PA 2500 2048 0.168 ... 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 Negative 2
3 00000002_000.png No Finding 0 2 81 M PA 2500 2048 0.171 ... 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 Negative 3
4 00000003_000.png Hernia 0 3 81 F PA 2582 2991 0.143 ... 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 Negative 4

5 rows × 30 columns

Creating training, validation and test datasets:

Goals:

  • no patient overlap between the datasets
  • balanced training and validation datasets
  • imbalanced test set with 20% of positive pneumonia cases
In [5]:
# Splitting the data into two portions:
# 1) 80% training and validation datasets
# 2) 20% test dataset
# in such a way that there is no Patient ID overlap between the sets
train_valid_inds, test_inds = next(GroupShuffleSplit(test_size=0.2, 
                                                     n_splits=1, 
                                                     random_state = 16).split(all_xray_df, groups=all_xray_df['Patient ID']))

train_valid_df = all_xray_df.loc[train_valid_inds]
test_df = all_xray_df.loc[test_inds]

# Resetting index for further split
train_valid_df.reset_index(inplace=True)
train_valid_df.head()
Out[5]:
index Image Index Finding Labels Follow-up # Patient ID Patient Age Patient Gender View Position OriginalImage[Width Height] ... Hernia Infiltration Mass No Finding Nodule Pleural_Thickening Pneumonia Pneumothorax pneumonia_class Scan_ID
0 3 00000002_000.png No Finding 0 2 81 M PA 2500 2048 ... 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 Negative 3
1 4 00000003_000.png Hernia 0 3 81 F PA 2582 2991 ... 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 Negative 4
2 5 00000003_001.png Hernia 1 3 74 F PA 2500 2048 ... 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 Negative 5
3 6 00000003_002.png Hernia 2 3 75 F PA 2048 2500 ... 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 Negative 6
4 7 00000003_003.png Hernia|Infiltration 3 3 76 F PA 2698 2991 ... 1.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 Negative 7

5 rows × 31 columns

In [6]:
# Splitting the train_valid dataset into two portions:
# 1) 90% training dataset
# 2) 10% validation dataset
# in such a way that there is no Patient ID overlap between the sets
train_inds, valid_inds = next(GroupShuffleSplit(test_size=0.1, 
                                                n_splits=1, 
                                                random_state = 16).split(train_valid_df, groups=train_valid_df['Patient ID']))

train_df = train_valid_df.loc[train_inds]
valid_df = train_valid_df.loc[valid_inds] 
In [7]:
# Confirming that there is no Scan_ID overlap between the datasets
train_scan_ids = train_df['Scan_ID'].to_numpy()
valid_scan_ids = valid_df['Scan_ID'].to_numpy()
test_scan_ids = test_df['Scan_ID'].to_numpy()
print(f'Scan ID overlap between train and valid datasets: {np.intersect1d(train_scan_ids, valid_scan_ids)}') 
print(f'Scan ID overlap between train and test datasets: {np.intersect1d(train_scan_ids, valid_scan_ids)}') 
print(f'Scan ID overlap between valid and test datasets: {np.intersect1d(valid_scan_ids, test_scan_ids)}') 
Scan ID overlap between train and valid datasets: []
Scan ID overlap between train and test datasets: []
Scan ID overlap between valid and test datasets: []
In [8]:
# Confirming that there is no Patient_ID overlap between the datasets
train_patient_ids = train_df['Patient ID'].to_numpy()
valid_patient_ids = valid_df['Patient ID'].to_numpy()
test_patient_ids = test_df['Patient ID'].to_numpy()
print(f'Patient ID overlap between train and valid datasets: {np.intersect1d(np.unique(train_patient_ids), np.unique(valid_patient_ids))}') 
print(f'Patient ID overlap between train and test datasets: {np.intersect1d(np.unique(train_patient_ids), np.unique(test_patient_ids))}')
print(f'Patient ID overlap between valid and test datasets: {np.intersect1d(np.unique(valid_patient_ids), np.unique(test_patient_ids))}')
Patient ID overlap between train and valid datasets: []
Patient ID overlap between train and test datasets: []
Patient ID overlap between valid and test datasets: []
In [9]:
# Checking how much pneumonia cases are there in the training and in the validation datasets
train_df_pneumonia_cases = len(train_df[train_df.Pneumonia==1])
valid_df_pneumonia_cases = len(valid_df[valid_df.Pneumonia==1])
test_df_pneumonia_cases = len(test_df[test_df.Pneumonia==1])
print(f'Pneumonia cases in the training set: {train_df_pneumonia_cases}.')
print(f'Pneumonia cases in the validation set: {valid_df_pneumonia_cases}.')
print(f'Pneumonia cases in the test set: {test_df_pneumonia_cases}.')
print(f'Percentage of pneumonia cases, which were assigned to the training set: {train_df_pneumonia_cases/(train_df_pneumonia_cases + valid_df_pneumonia_cases + test_df_pneumonia_cases)*100:.3}%')
print(f'Percentage of pneumonia cases, which were assigned to the validation set: {valid_df_pneumonia_cases/(train_df_pneumonia_cases + valid_df_pneumonia_cases + test_df_pneumonia_cases)*100:.3}%')
print(f'Percentage of pneumonia cases, which were assigned to the test set: {test_df_pneumonia_cases/(train_df_pneumonia_cases + valid_df_pneumonia_cases + test_df_pneumonia_cases)*100:.3}%')
Pneumonia cases in the training set: 1070.
Pneumonia cases in the validation set: 87.
Pneumonia cases in the test set: 274.
Percentage of pneumonia cases, which were assigned to the training set: 74.8%
Percentage of pneumonia cases, which were assigned to the validation set: 6.08%
Percentage of pneumonia cases, which were assigned to the test set: 19.1%
In [10]:
# Sampling randomly negative cases, such that the traning dataset consist of 50% positive cases.
train_scan_ids_pos = train_df[train_df.Pneumonia==1].index.to_numpy()
train_scan_ids_neg = train_df[train_df.Pneumonia==0].index.to_numpy()
train_scan_ids_neg_sample = np.random.choice(train_scan_ids_neg, size=train_df_pneumonia_cases, replace=False)
train_scan_ids_pos_neg_sample = np.concatenate((train_scan_ids_pos, train_scan_ids_neg_sample))
train_df = train_df.loc[train_scan_ids_pos_neg_sample]
train_df = train_df.sample(frac=1)
train_df.head()
Out[10]:
index Image Index Finding Labels Follow-up # Patient ID Patient Age Patient Gender View Position OriginalImage[Width Height] ... Hernia Infiltration Mass No Finding Nodule Pleural_Thickening Pneumonia Pneumothorax pneumonia_class Scan_ID
45461 56830 00014127_000.png No Finding 0 14127 35 M PA 2992 2991 ... 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 Negative 56830
12791 16260 00004342_054.png Cardiomegaly|Edema|Effusion|Pneumonia 54 4342 48 M AP 2500 2048 ... 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 Positive 16260
62559 78142 00019176_052.png Consolidation|Infiltration|Pneumonia 52 19176 66 F AP 3056 2544 ... 0.0 1.0 0.0 0.0 0.0 0.0 1.0 0.0 Positive 78142
32470 40964 00010609_004.png No Finding 4 10609 36 F PA 2516 2330 ... 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 Negative 40964
61545 76788 00018865_008.png Emphysema|Pneumothorax 8 18865 69 F PA 2722 2991 ... 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 Negative 76788

5 rows × 31 columns

In [11]:
# Sampling randomly negative cases, such that the validation dataset consist of 50% positive cases.
valid_scan_ids_pos = valid_df[valid_df.Pneumonia==1].index.to_numpy()
valid_scan_ids_neg = valid_df[valid_df.Pneumonia==0].index.to_numpy()
valid_scan_ids_neg_sample = np.random.choice(valid_scan_ids_neg, size=valid_df_pneumonia_cases, replace=False)
valid_scan_ids_pos_neg_sample = np.concatenate((valid_scan_ids_pos, valid_scan_ids_neg_sample))
valid_df = valid_df.loc[valid_scan_ids_pos_neg_sample]
valid_df = valid_df.sample(frac=1)
In [12]:
# Sampling randomly negative cases, such that the test dataset consist of 20% positive cases.
test_scan_ids_pos = test_df[test_df.Pneumonia==1].index.to_numpy()
test_scan_ids_neg = test_df[test_df.Pneumonia==0].index.to_numpy()
test_scan_ids_neg_sample = np.random.choice(test_scan_ids_neg, size=4*test_df_pneumonia_cases, replace=False)
test_scan_ids_pos_neg_sample = np.concatenate((test_scan_ids_pos, test_scan_ids_neg_sample))
test_df = test_df.loc[test_scan_ids_pos_neg_sample]
test_df = test_df.sample(frac=1)
In [13]:
# Determination of the number of scans in each dataset
print(f'Number of scans in the training set: {len(train_df)}.')
print(f'Number of scans in the validation set: {len(valid_df)}.')
print(f'Number of scans in the test set: {len(test_df)}.')
Number of scans in the training set: 2140.
Number of scans in the validation set: 174.
Number of scans in the test set: 1370.

Comparison of demographic distributions in the training, validation and test datasets:

In [14]:
# Distribution of findings other than pneumonia that are present in the scans

finding_labels_no_pneumonia = np.delete(finding_labels, [10,13]) 
    #deleting "no finding" and "pneumonia" label
    
finding_labels_no_pneumonia_occurance_train = train_df.loc[:,finding_labels_no_pneumonia].sum()/len(train_df.loc[:,finding_labels_no_pneumonia])
finding_labels_no_pneumonia_occurance_valid = valid_df.loc[:,finding_labels_no_pneumonia].sum()/len(valid_df.loc[:,finding_labels_no_pneumonia])
finding_labels_no_pneumonia_occurance_test = test_df.loc[:,finding_labels_no_pneumonia].sum()/len(test_df.loc[:,finding_labels_no_pneumonia])

fig, ax = plt.subplots(3,1, figsize=(15,20))
ax[0].bar(finding_labels_no_pneumonia, finding_labels_no_pneumonia_occurance_train, color='blue')
ax[0].set_title("Training set: Distribution of findings other than pneumonia")
ax[0].set_ylabel('% of scans')
ax[0].set_xticklabels(finding_labels_no_pneumonia, rotation=90)

ax[1].bar(finding_labels_no_pneumonia, finding_labels_no_pneumonia_occurance_valid, color='red')
ax[1].set_title("Validation set: Distribution of findings other than pneumonia")
ax[1].set_ylabel('% of scans')
ax[1].set_xticklabels(finding_labels_no_pneumonia, rotation=90)

ax[2].bar(finding_labels_no_pneumonia, finding_labels_no_pneumonia_occurance_test, color='green')
ax[2].set_title("Test set: Distribution of findings other than pneumonia")
ax[2].set_ylabel('% of scans')
ax[2].set_xticklabels(finding_labels_no_pneumonia, rotation=90)
fig.tight_layout()
  • The biggest difference is in the occurance of Edema, Hernia and Fibrosis. Their coocurence with pheumonia will be investiaged in the next step.
In [15]:
# Distribution of co-ocurrent findings together with pneumonia

fig, ax = plt.subplots(1,3, figsize=(20,6))
fig.suptitle("Pneumonia and most common coexisting findings", fontsize=22, y = 1, x = 0.52)

train_df_coocurrent = train_df[train_df.Pneumonia==1]['Finding Labels'].value_counts()[0:5]
valid_df_coocurrent = valid_df[valid_df.Pneumonia==1]['Finding Labels'].value_counts()[0:5]
test_df_coocurrent = test_df[test_df.Pneumonia==1]['Finding Labels'].value_counts()[0:5]

ax[0].bar(train_df_coocurrent.index, train_df_coocurrent, color='blue')
ax[0].set_title("Training dataset")
ax[0].set_xticklabels(train_df[train_df.Pneumonia==1]['Finding Labels'][0:5], rotation=90)
ax[0].set_ylabel('# of scans')

ax[1].bar(valid_df_coocurrent.index, valid_df_coocurrent, color='red')
ax[1].set_title("Validation dataset")
ax[1].set_xticklabels(valid_df[valid_df.Pneumonia==1]['Finding Labels'][0:5], rotation=90)

ax[2].bar(test_df_coocurrent.index, test_df_coocurrent, color='green')
ax[2].set_title("Test dataset")
ax[2].set_xticklabels(test_df[test_df.Pneumonia==1]['Finding Labels'][0:5], rotation=90)

plt.show()

Coocurent findings:

  • In train and validation datasets Pneumonia most frequently is present in a scan together with other findings, whose combination varies between those datasets.

Coocurence with Edema, Hernia and Fibrosis:

  • In the training dataset, the 1st most common coocurence with Pneumonia is Edema together with Effusion and Cardiomegaly.
  • In the validation dataset, the 1st most common coocurence with Pneumonia is Edema together with Infiltration.
  • In the test dataset, the 3rd most common coocurence with Pneumonia is Edema.
  • Hernia and Firbosis are not one of the most common coocurances with Pneumonia.


During in-depth evaluation of the best model, possible cross-talk existance between pneumonia and the most common coocurent findings has to be investigated.

In [16]:
# Distribution of age 
fig, ax = plt.subplots(1,3, figsize=(15,6))
fig.suptitle("Patients' age distribution", fontsize=22, y = 1, x = 0.52)

ax[0].hist(train_df['Patient Age'], bins=50, range=(0,100), color='blue')
ax[0].set_title("Training data set")
ax[0].set_xlabel('Age')
ax[0].set_ylabel('# of X-ray scans')
ax[1].hist(valid_df['Patient Age'], bins=50, range=(0,100), color='red')
ax[1].set_title("Validation dataset")
ax[1].set_xlabel('Age')
ax[2].hist(test_df['Patient Age'], bins=50, range=(0,100), color='green')
ax[2].set_title("Test dataset")
ax[2].set_xlabel('Age')
fig.tight_layout(pad=2.0)
plt.show()
In [17]:
# Distribution of patients' gender 
train_df_gender = train_df['Patient Gender'].value_counts()
valid_df_gender = valid_df['Patient Gender'].value_counts()
test_df_gender = test_df['Patient Gender'].value_counts()

fig, ax = plt.subplots(1,3, figsize=(15,6))
fig.suptitle("Patients' gender distribution", fontsize=22, y = 1, x = 0.52)

ax[0].bar(train_df_gender.index, train_df_gender, color='blue')
ax[0].set_title("Training dataset")
ax[0].set_xlabel('Gender')
ax[0].set_ylabel('# of X-ray scans')

ax[1].bar(valid_df_gender.index, valid_df_gender, color='red')
ax[1].set_title("Validation dataset")
ax[1].set_xlabel('Gender')

ax[2].bar(test_df_gender.index, test_df_gender, color='green')
ax[2].set_title("Test dataset")
ax[2].set_xlabel('Gender')

fig.tight_layout(pad=2.0)
plt.show()
In [18]:
M_F_ratio_train = train_df['Patient Gender'].value_counts()[0] / train_df['Patient Gender'].value_counts()[1] 
print(f'M/F ratio: Training dataset {M_F_ratio_train:.2}.')

M_F_ratio_valid = valid_df['Patient Gender'].value_counts()[0] / valid_df['Patient Gender'].value_counts()[1] 
print(f'M/F ratio: Validation dataset {M_F_ratio_valid:.2}.')

M_F_ratio_test = test_df['Patient Gender'].value_counts()[0] / test_df['Patient Gender'].value_counts()[1] 
print(f'M/F ratio: Test dataset {M_F_ratio_test:.2}.')
M/F ratio: Training dataset 1.3.
M/F ratio: Validation dataset 1.3.
M/F ratio: Test dataset 1.4.
In [19]:
# Distribution of view position in the train and the validaton set
train_df_view = train_df['View Position'].value_counts()
valid_df_view = valid_df['View Position'].value_counts()
test_df_view = test_df['View Position'].value_counts()

fig, ax = plt.subplots(1,3, figsize=(15,6))
fig.suptitle("View positions", fontsize=22, y = 1, x = 0.52)

ax[0].bar(train_df_view.index, train_df_view, color='blue')
ax[0].set_title("Training dataset")
ax[0].set_xlabel('View position')
ax[0].set_ylabel('# of X-ray scans')

ax[1].bar(valid_df_view.index, valid_df_view, color='red')
ax[1].set_title("Validation dataset")
ax[1].set_xlabel('View position')

ax[2].bar(test_df_view.index, test_df_view, color='green')
ax[2].set_title("Test dataset")
ax[2].set_xlabel('View position')

fig.tight_layout(pad=2.0)
plt.show()
In [20]:
PA_AP_ratio_train = train_df['View Position'].value_counts()[0] / train_df['View Position'].value_counts()[1] 
print(f'PA/AP ratio: Training dataset {PA_AP_ratio_train:.2}.')

PA_AP_ratio_valid = valid_df['View Position'].value_counts()[0] / valid_df['View Position'].value_counts()[1] 
print(f'PA/AP ratio: Validation dataset {PA_AP_ratio_valid:.2}.')

PA_AP_ratio_test = test_df['View Position'].value_counts()[0] / test_df['View Position'].value_counts()[1] 
print(f'PA/AP ratio: Test dataset {PA_AP_ratio_test:.2}.')
PA/AP ratio: Training dataset 1.1.
PA/AP ratio: Validation dataset 1.4.
PA/AP ratio: Test dataset 1.3.

PA projection:

  • patient is standing: X-ray beam passes through the patient from Posterior to Anterior (i.e. from back to front)

AP projection:

  • patient is sitting: X-ray beam passes through the patient from Anterior to Posterior (i.e. from front to back)


Differences between PA and AP views:

  • heart size is exaggerated in the AP projection, as it is realtively further from the detector and and the X-ray beam is more divergent.
  • AP projection images are of lower quality then PA images.

Ref: https://www.radiologymasterclass.co.uk/tutorials/chest/chest_quality/chest_xray_quality_projection

In [21]:
# Distribution of number of scans per patient in the train and the validaton set
fig, ax = plt.subplots(1,3, figsize=(15,6))
fig.suptitle("Numer of scans per patient distribution", fontsize=22, y = 1, x = 0.52)

ax[0].hist(train_df['Patient ID'].value_counts(), bins=20, log=True, color='blue')
ax[0].set_title("Train dataset")
ax[0].set_xlabel('# of scans')
ax[0].set_ylabel('# of patients')

ax[1].hist(valid_df['Patient ID'].value_counts(), bins=20, log=True, color='red')
ax[1].set_title("Validation dataset")
ax[1].set_xlabel('# of scans')

ax[2].hist(test_df['Patient ID'].value_counts(), bins=20, log=True, color='green')
ax[2].set_title("Test dataset:")
ax[2].set_xlabel('# of scans')

fig.tight_layout(pad=2.0)
plt.show()

Saving and loading of training, validation and test datasets:

In [22]:
# Saving train_df, valid_df and test_df
train_df.to_pickle('train_df.pkl')
valid_df.to_pickle('valid_df.pkl')
test_df.to_pickle('test_df.pkl')
In [2]:
# Loading train_df, valid_df and test_df
train_df = pd.read_pickle('train_df.pkl')
valid_df = pd.read_pickle('valid_df.pkl')
test_df = pd.read_pickle('test_df.pkl')

Functions for training, validation and test generators

In [3]:
def image_generator_train(samplewise_center=True, samplewise_std_normalization=True, 
                             height_shift_range=0.1, width_shift_range=0.1, 
                             rotation_range=2, zoom_range=0.01):
    
    """
    Return training image data generator with augumented images
    
    Args:
    samplewise_center (Boolean): Set each sample mean to 0.
    samplewise_std_normalization (Boolean): Divide each input by its std.
    height_shift_range (Float): Fraction of total height.
    width_shift_range (Float): Fraction of total width.
    rotation_range (Int): Degree range for random rotations.
    zoom_range (Float or [lower, upper]): Range for random zoom.
    
    Returns:
        train_idg (ImageDataGenrator): image generator for training set
    """
    
    train_idg = ImageDataGenerator(samplewise_center = samplewise_center,
                                   samplewise_std_normalization = samplewise_std_normalization,
                                   height_shift_range = height_shift_range,
                                   width_shift_range = width_shift_range,
                                   rotation_range = rotation_range,
                                   zoom_range = zoom_range)
    
    return train_idg



def get_generator_train(train_idg, train_df, directory=None, x_col='path', y_col='pneumonia_class', 
                  class_mode='binary', target_size=(224, 224), batch_size=64, shuffle=True, seed=16):
    
    """
    Return generator for the training set, reading the image paths to load from dataframe
    
    Args:
      train_idg (ImageDataGenerator) image data generator for the training set.
      train_df (dataframe): dataframe containing training data.
      directory (str): directory, in which image files are held.
      x_col (str): name of column in dataframe that holds filenames.
      y_col (str): name of column in dataframe that holds filenames the label for images.
      class_mode (str): one of "binary", "categorical", "input", "multi_output", "raw", sparse" or None. 
      target_size (tuple of integers (height, width)): default: The dimensions to which all images found will be resized.
      batch_size (int): images per batch to be fed into model during training.
      shuffle (Boolean): whether to shuffle the data 
      seed (int): random seed.
    
    Returns:
        train_gen (DataFrameIterator): iterator over training set
    """
    
    train_gen = train_idg.flow_from_dataframe(dataframe = train_df, 
                                         directory=directory,
                                         x_col = x_col,
                                         y_col = y_col,   
                                         class_mode = class_mode,
                                         target_size = target_size,
                                         shuffle = shuffle,
                                         seed = seed,
                                         batch_size = batch_size)     

    return train_gen
In [4]:
def get_test_valid_generator(samplewise_center=True, samplewise_std_normalization=True, train_df=train_df, valid_df=valid_df,
                             test_df=test_df, directory=None, x_col='path', y_col='pneumonia_class', class_mode='binary', 
                             target_size=(224, 224), sample_size = 100, batch_size=64, shuffle=True, seed=16):
    
    """
    Return generator for validation set and test test set using 
    statistics from training set.

    Args:
        samplewise_center (Boolean): Set each sample mean to 0.
        samplewise_std_normalization (Boolean): Divide each input by its std.
        train_df (dataframe): dataframe containing training data.
        valid_df (dataframe): dataframe containing validation data.
        test_df (dataframe): dataframe containing test data.
        directory (str): directory, in which image files are held.
        x_col (str): name of column in dataframe that holds filenames.
        y_col (str): name of column in dataframe that holds filenames the label for images.
        class_mode (str): one of "binary", "categorical", "input", "multi_output", "raw", sparse" or None. 
        target_size (tuple of integers (height, width)): default: The dimensions to which all images found will be resized.
        sample_size (int): size of sample to use for normalization statistics.
        batch_size (int): images per batch to be fed into model during training.
        shuffle (Boolean): whether to shuffle the data 
        seed (int): random seed.

    Returns:
        valid_generator and test_generator (DataFrameIterator): iterators over validation and test datsets respectively
    """
     
    # get generator for sample dataset from training set    
    raw_train_generator = ImageDataGenerator().flow_from_dataframe(
                                        dataframe = train_df, 
                                        directory = directory, 
                                        x_col = x_col, 
                                        y_col = y_col, 
                                        class_mode = class_mode,
                                        target_size = target_size,
                                        shuffle = shuffle,
                                        seed = seed,
                                        batch_size = sample_size)
    
    # get one batch of sample data
    batch = raw_train_generator.next()
    data_sample = batch[0]
    
    # get mean and std from sample data to normalize images in test set generator
    image_generator = ImageDataGenerator(featurewise_center=True, featurewise_std_normalization= True)
    image_generator.fit(data_sample)
    print(f'Mean of {sample_size} images from the training set: {np.mean(data_sample):.4}')
    print(f'Std of {sample_size} images from the training set: {np.std(data_sample):.4}')
    np.save('Training_sample_mean', np.mean(data_sample))
    np.save('Training_sample_std', np.std(data_sample))
    
    
    # get validation generator
    valid_gen = image_generator.flow_from_dataframe(dataframe = valid_df, 
                                             directory = directory, 
                                             x_col = x_col,
                                             y_col = y_col, 
                                             class_mode = class_mode,
                                             target_size = target_size, 
                                             batch_size = batch_size,
                                             shuffle = False,
                                             seed = seed) 
    
    # get test generator
    test_gen = image_generator.flow_from_dataframe(dataframe = test_df, 
                                             directory = directory, 
                                             x_col = x_col,
                                             y_col = y_col, 
                                             class_mode = class_mode,
                                             target_size = target_size, 
                                             batch_size = batch_size,
                                             shuffle = False,
                                             seed = seed) 
    
    return valid_gen, test_gen

Fuctions for model building

In [24]:
# The models in this notebook will be based on DenseNet121 pretrained model with imagenet weights
model = DenseNet121(include_top=True, weights='imagenet')
model.summary()
Model: "densenet121"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_5 (InputLayer)            (None, 224, 224, 3)  0                                            
__________________________________________________________________________________________________
zero_padding2d_9 (ZeroPadding2D (None, 230, 230, 3)  0           input_5[0][0]                    
__________________________________________________________________________________________________
conv1/conv (Conv2D)             (None, 112, 112, 64) 9408        zero_padding2d_9[0][0]           
__________________________________________________________________________________________________
conv1/bn (BatchNormalization)   (None, 112, 112, 64) 256         conv1/conv[0][0]                 
__________________________________________________________________________________________________
conv1/relu (Activation)         (None, 112, 112, 64) 0           conv1/bn[0][0]                   
__________________________________________________________________________________________________
zero_padding2d_10 (ZeroPadding2 (None, 114, 114, 64) 0           conv1/relu[0][0]                 
__________________________________________________________________________________________________
pool1 (MaxPooling2D)            (None, 56, 56, 64)   0           zero_padding2d_10[0][0]          
__________________________________________________________________________________________________
conv2_block1_0_bn (BatchNormali (None, 56, 56, 64)   256         pool1[0][0]                      
__________________________________________________________________________________________________
conv2_block1_0_relu (Activation (None, 56, 56, 64)   0           conv2_block1_0_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_1_conv (Conv2D)    (None, 56, 56, 128)  8192        conv2_block1_0_relu[0][0]        
__________________________________________________________________________________________________
conv2_block1_1_bn (BatchNormali (None, 56, 56, 128)  512         conv2_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_1_relu (Activation (None, 56, 56, 128)  0           conv2_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_2_conv (Conv2D)    (None, 56, 56, 32)   36864       conv2_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block1_concat (Concatenat (None, 56, 56, 96)   0           pool1[0][0]                      
                                                                 conv2_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_0_bn (BatchNormali (None, 56, 56, 96)   384         conv2_block1_concat[0][0]        
__________________________________________________________________________________________________
conv2_block2_0_relu (Activation (None, 56, 56, 96)   0           conv2_block2_0_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_1_conv (Conv2D)    (None, 56, 56, 128)  12288       conv2_block2_0_relu[0][0]        
__________________________________________________________________________________________________
conv2_block2_1_bn (BatchNormali (None, 56, 56, 128)  512         conv2_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_1_relu (Activation (None, 56, 56, 128)  0           conv2_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_2_conv (Conv2D)    (None, 56, 56, 32)   36864       conv2_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block2_concat (Concatenat (None, 56, 56, 128)  0           conv2_block1_concat[0][0]        
                                                                 conv2_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_0_bn (BatchNormali (None, 56, 56, 128)  512         conv2_block2_concat[0][0]        
__________________________________________________________________________________________________
conv2_block3_0_relu (Activation (None, 56, 56, 128)  0           conv2_block3_0_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_1_conv (Conv2D)    (None, 56, 56, 128)  16384       conv2_block3_0_relu[0][0]        
__________________________________________________________________________________________________
conv2_block3_1_bn (BatchNormali (None, 56, 56, 128)  512         conv2_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_1_relu (Activation (None, 56, 56, 128)  0           conv2_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_2_conv (Conv2D)    (None, 56, 56, 32)   36864       conv2_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block3_concat (Concatenat (None, 56, 56, 160)  0           conv2_block2_concat[0][0]        
                                                                 conv2_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block4_0_bn (BatchNormali (None, 56, 56, 160)  640         conv2_block3_concat[0][0]        
__________________________________________________________________________________________________
conv2_block4_0_relu (Activation (None, 56, 56, 160)  0           conv2_block4_0_bn[0][0]          
__________________________________________________________________________________________________
conv2_block4_1_conv (Conv2D)    (None, 56, 56, 128)  20480       conv2_block4_0_relu[0][0]        
__________________________________________________________________________________________________
conv2_block4_1_bn (BatchNormali (None, 56, 56, 128)  512         conv2_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block4_1_relu (Activation (None, 56, 56, 128)  0           conv2_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block4_2_conv (Conv2D)    (None, 56, 56, 32)   36864       conv2_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block4_concat (Concatenat (None, 56, 56, 192)  0           conv2_block3_concat[0][0]        
                                                                 conv2_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block5_0_bn (BatchNormali (None, 56, 56, 192)  768         conv2_block4_concat[0][0]        
__________________________________________________________________________________________________
conv2_block5_0_relu (Activation (None, 56, 56, 192)  0           conv2_block5_0_bn[0][0]          
__________________________________________________________________________________________________
conv2_block5_1_conv (Conv2D)    (None, 56, 56, 128)  24576       conv2_block5_0_relu[0][0]        
__________________________________________________________________________________________________
conv2_block5_1_bn (BatchNormali (None, 56, 56, 128)  512         conv2_block5_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block5_1_relu (Activation (None, 56, 56, 128)  0           conv2_block5_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block5_2_conv (Conv2D)    (None, 56, 56, 32)   36864       conv2_block5_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block5_concat (Concatenat (None, 56, 56, 224)  0           conv2_block4_concat[0][0]        
                                                                 conv2_block5_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block6_0_bn (BatchNormali (None, 56, 56, 224)  896         conv2_block5_concat[0][0]        
__________________________________________________________________________________________________
conv2_block6_0_relu (Activation (None, 56, 56, 224)  0           conv2_block6_0_bn[0][0]          
__________________________________________________________________________________________________
conv2_block6_1_conv (Conv2D)    (None, 56, 56, 128)  28672       conv2_block6_0_relu[0][0]        
__________________________________________________________________________________________________
conv2_block6_1_bn (BatchNormali (None, 56, 56, 128)  512         conv2_block6_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block6_1_relu (Activation (None, 56, 56, 128)  0           conv2_block6_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block6_2_conv (Conv2D)    (None, 56, 56, 32)   36864       conv2_block6_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block6_concat (Concatenat (None, 56, 56, 256)  0           conv2_block5_concat[0][0]        
                                                                 conv2_block6_2_conv[0][0]        
__________________________________________________________________________________________________
pool2_bn (BatchNormalization)   (None, 56, 56, 256)  1024        conv2_block6_concat[0][0]        
__________________________________________________________________________________________________
pool2_relu (Activation)         (None, 56, 56, 256)  0           pool2_bn[0][0]                   
__________________________________________________________________________________________________
pool2_conv (Conv2D)             (None, 56, 56, 128)  32768       pool2_relu[0][0]                 
__________________________________________________________________________________________________
pool2_pool (AveragePooling2D)   (None, 28, 28, 128)  0           pool2_conv[0][0]                 
__________________________________________________________________________________________________
conv3_block1_0_bn (BatchNormali (None, 28, 28, 128)  512         pool2_pool[0][0]                 
__________________________________________________________________________________________________
conv3_block1_0_relu (Activation (None, 28, 28, 128)  0           conv3_block1_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_1_conv (Conv2D)    (None, 28, 28, 128)  16384       conv3_block1_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block1_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_1_relu (Activation (None, 28, 28, 128)  0           conv3_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_2_conv (Conv2D)    (None, 28, 28, 32)   36864       conv3_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block1_concat (Concatenat (None, 28, 28, 160)  0           pool2_pool[0][0]                 
                                                                 conv3_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_0_bn (BatchNormali (None, 28, 28, 160)  640         conv3_block1_concat[0][0]        
__________________________________________________________________________________________________
conv3_block2_0_relu (Activation (None, 28, 28, 160)  0           conv3_block2_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_1_conv (Conv2D)    (None, 28, 28, 128)  20480       conv3_block2_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block2_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_1_relu (Activation (None, 28, 28, 128)  0           conv3_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_2_conv (Conv2D)    (None, 28, 28, 32)   36864       conv3_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block2_concat (Concatenat (None, 28, 28, 192)  0           conv3_block1_concat[0][0]        
                                                                 conv3_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_0_bn (BatchNormali (None, 28, 28, 192)  768         conv3_block2_concat[0][0]        
__________________________________________________________________________________________________
conv3_block3_0_relu (Activation (None, 28, 28, 192)  0           conv3_block3_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_1_conv (Conv2D)    (None, 28, 28, 128)  24576       conv3_block3_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block3_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_1_relu (Activation (None, 28, 28, 128)  0           conv3_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_2_conv (Conv2D)    (None, 28, 28, 32)   36864       conv3_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block3_concat (Concatenat (None, 28, 28, 224)  0           conv3_block2_concat[0][0]        
                                                                 conv3_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_0_bn (BatchNormali (None, 28, 28, 224)  896         conv3_block3_concat[0][0]        
__________________________________________________________________________________________________
conv3_block4_0_relu (Activation (None, 28, 28, 224)  0           conv3_block4_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_1_conv (Conv2D)    (None, 28, 28, 128)  28672       conv3_block4_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block4_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_1_relu (Activation (None, 28, 28, 128)  0           conv3_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_2_conv (Conv2D)    (None, 28, 28, 32)   36864       conv3_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block4_concat (Concatenat (None, 28, 28, 256)  0           conv3_block3_concat[0][0]        
                                                                 conv3_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block5_0_bn (BatchNormali (None, 28, 28, 256)  1024        conv3_block4_concat[0][0]        
__________________________________________________________________________________________________
conv3_block5_0_relu (Activation (None, 28, 28, 256)  0           conv3_block5_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block5_1_conv (Conv2D)    (None, 28, 28, 128)  32768       conv3_block5_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block5_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block5_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block5_1_relu (Activation (None, 28, 28, 128)  0           conv3_block5_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block5_2_conv (Conv2D)    (None, 28, 28, 32)   36864       conv3_block5_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block5_concat (Concatenat (None, 28, 28, 288)  0           conv3_block4_concat[0][0]        
                                                                 conv3_block5_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block6_0_bn (BatchNormali (None, 28, 28, 288)  1152        conv3_block5_concat[0][0]        
__________________________________________________________________________________________________
conv3_block6_0_relu (Activation (None, 28, 28, 288)  0           conv3_block6_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block6_1_conv (Conv2D)    (None, 28, 28, 128)  36864       conv3_block6_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block6_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block6_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block6_1_relu (Activation (None, 28, 28, 128)  0           conv3_block6_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block6_2_conv (Conv2D)    (None, 28, 28, 32)   36864       conv3_block6_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block6_concat (Concatenat (None, 28, 28, 320)  0           conv3_block5_concat[0][0]        
                                                                 conv3_block6_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block7_0_bn (BatchNormali (None, 28, 28, 320)  1280        conv3_block6_concat[0][0]        
__________________________________________________________________________________________________
conv3_block7_0_relu (Activation (None, 28, 28, 320)  0           conv3_block7_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block7_1_conv (Conv2D)    (None, 28, 28, 128)  40960       conv3_block7_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block7_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block7_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block7_1_relu (Activation (None, 28, 28, 128)  0           conv3_block7_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block7_2_conv (Conv2D)    (None, 28, 28, 32)   36864       conv3_block7_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block7_concat (Concatenat (None, 28, 28, 352)  0           conv3_block6_concat[0][0]        
                                                                 conv3_block7_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block8_0_bn (BatchNormali (None, 28, 28, 352)  1408        conv3_block7_concat[0][0]        
__________________________________________________________________________________________________
conv3_block8_0_relu (Activation (None, 28, 28, 352)  0           conv3_block8_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block8_1_conv (Conv2D)    (None, 28, 28, 128)  45056       conv3_block8_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block8_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block8_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block8_1_relu (Activation (None, 28, 28, 128)  0           conv3_block8_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block8_2_conv (Conv2D)    (None, 28, 28, 32)   36864       conv3_block8_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block8_concat (Concatenat (None, 28, 28, 384)  0           conv3_block7_concat[0][0]        
                                                                 conv3_block8_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block9_0_bn (BatchNormali (None, 28, 28, 384)  1536        conv3_block8_concat[0][0]        
__________________________________________________________________________________________________
conv3_block9_0_relu (Activation (None, 28, 28, 384)  0           conv3_block9_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block9_1_conv (Conv2D)    (None, 28, 28, 128)  49152       conv3_block9_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block9_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block9_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block9_1_relu (Activation (None, 28, 28, 128)  0           conv3_block9_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block9_2_conv (Conv2D)    (None, 28, 28, 32)   36864       conv3_block9_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block9_concat (Concatenat (None, 28, 28, 416)  0           conv3_block8_concat[0][0]        
                                                                 conv3_block9_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block10_0_bn (BatchNormal (None, 28, 28, 416)  1664        conv3_block9_concat[0][0]        
__________________________________________________________________________________________________
conv3_block10_0_relu (Activatio (None, 28, 28, 416)  0           conv3_block10_0_bn[0][0]         
__________________________________________________________________________________________________
conv3_block10_1_conv (Conv2D)   (None, 28, 28, 128)  53248       conv3_block10_0_relu[0][0]       
__________________________________________________________________________________________________
conv3_block10_1_bn (BatchNormal (None, 28, 28, 128)  512         conv3_block10_1_conv[0][0]       
__________________________________________________________________________________________________
conv3_block10_1_relu (Activatio (None, 28, 28, 128)  0           conv3_block10_1_bn[0][0]         
__________________________________________________________________________________________________
conv3_block10_2_conv (Conv2D)   (None, 28, 28, 32)   36864       conv3_block10_1_relu[0][0]       
__________________________________________________________________________________________________
conv3_block10_concat (Concatena (None, 28, 28, 448)  0           conv3_block9_concat[0][0]        
                                                                 conv3_block10_2_conv[0][0]       
__________________________________________________________________________________________________
conv3_block11_0_bn (BatchNormal (None, 28, 28, 448)  1792        conv3_block10_concat[0][0]       
__________________________________________________________________________________________________
conv3_block11_0_relu (Activatio (None, 28, 28, 448)  0           conv3_block11_0_bn[0][0]         
__________________________________________________________________________________________________
conv3_block11_1_conv (Conv2D)   (None, 28, 28, 128)  57344       conv3_block11_0_relu[0][0]       
__________________________________________________________________________________________________
conv3_block11_1_bn (BatchNormal (None, 28, 28, 128)  512         conv3_block11_1_conv[0][0]       
__________________________________________________________________________________________________
conv3_block11_1_relu (Activatio (None, 28, 28, 128)  0           conv3_block11_1_bn[0][0]         
__________________________________________________________________________________________________
conv3_block11_2_conv (Conv2D)   (None, 28, 28, 32)   36864       conv3_block11_1_relu[0][0]       
__________________________________________________________________________________________________
conv3_block11_concat (Concatena (None, 28, 28, 480)  0           conv3_block10_concat[0][0]       
                                                                 conv3_block11_2_conv[0][0]       
__________________________________________________________________________________________________
conv3_block12_0_bn (BatchNormal (None, 28, 28, 480)  1920        conv3_block11_concat[0][0]       
__________________________________________________________________________________________________
conv3_block12_0_relu (Activatio (None, 28, 28, 480)  0           conv3_block12_0_bn[0][0]         
__________________________________________________________________________________________________
conv3_block12_1_conv (Conv2D)   (None, 28, 28, 128)  61440       conv3_block12_0_relu[0][0]       
__________________________________________________________________________________________________
conv3_block12_1_bn (BatchNormal (None, 28, 28, 128)  512         conv3_block12_1_conv[0][0]       
__________________________________________________________________________________________________
conv3_block12_1_relu (Activatio (None, 28, 28, 128)  0           conv3_block12_1_bn[0][0]         
__________________________________________________________________________________________________
conv3_block12_2_conv (Conv2D)   (None, 28, 28, 32)   36864       conv3_block12_1_relu[0][0]       
__________________________________________________________________________________________________
conv3_block12_concat (Concatena (None, 28, 28, 512)  0           conv3_block11_concat[0][0]       
                                                                 conv3_block12_2_conv[0][0]       
__________________________________________________________________________________________________
pool3_bn (BatchNormalization)   (None, 28, 28, 512)  2048        conv3_block12_concat[0][0]       
__________________________________________________________________________________________________
pool3_relu (Activation)         (None, 28, 28, 512)  0           pool3_bn[0][0]                   
__________________________________________________________________________________________________
pool3_conv (Conv2D)             (None, 28, 28, 256)  131072      pool3_relu[0][0]                 
__________________________________________________________________________________________________
pool3_pool (AveragePooling2D)   (None, 14, 14, 256)  0           pool3_conv[0][0]                 
__________________________________________________________________________________________________
conv4_block1_0_bn (BatchNormali (None, 14, 14, 256)  1024        pool3_pool[0][0]                 
__________________________________________________________________________________________________
conv4_block1_0_relu (Activation (None, 14, 14, 256)  0           conv4_block1_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_1_conv (Conv2D)    (None, 14, 14, 128)  32768       conv4_block1_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block1_1_bn (BatchNormali (None, 14, 14, 128)  512         conv4_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_1_relu (Activation (None, 14, 14, 128)  0           conv4_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_2_conv (Conv2D)    (None, 14, 14, 32)   36864       conv4_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block1_concat (Concatenat (None, 14, 14, 288)  0           pool3_pool[0][0]                 
                                                                 conv4_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_0_bn (BatchNormali (None, 14, 14, 288)  1152        conv4_block1_concat[0][0]        
__________________________________________________________________________________________________
conv4_block2_0_relu (Activation (None, 14, 14, 288)  0           conv4_block2_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_1_conv (Conv2D)    (None, 14, 14, 128)  36864       conv4_block2_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block2_1_bn (BatchNormali (None, 14, 14, 128)  512         conv4_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_1_relu (Activation (None, 14, 14, 128)  0           conv4_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_2_conv (Conv2D)    (None, 14, 14, 32)   36864       conv4_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block2_concat (Concatenat (None, 14, 14, 320)  0           conv4_block1_concat[0][0]        
                                                                 conv4_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_0_bn (BatchNormali (None, 14, 14, 320)  1280        conv4_block2_concat[0][0]        
__________________________________________________________________________________________________
conv4_block3_0_relu (Activation (None, 14, 14, 320)  0           conv4_block3_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_1_conv (Conv2D)    (None, 14, 14, 128)  40960       conv4_block3_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block3_1_bn (BatchNormali (None, 14, 14, 128)  512         conv4_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_1_relu (Activation (None, 14, 14, 128)  0           conv4_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_2_conv (Conv2D)    (None, 14, 14, 32)   36864       conv4_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block3_concat (Concatenat (None, 14, 14, 352)  0           conv4_block2_concat[0][0]        
                                                                 conv4_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_0_bn (BatchNormali (None, 14, 14, 352)  1408        conv4_block3_concat[0][0]        
__________________________________________________________________________________________________
conv4_block4_0_relu (Activation (None, 14, 14, 352)  0           conv4_block4_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_1_conv (Conv2D)    (None, 14, 14, 128)  45056       conv4_block4_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block4_1_bn (BatchNormali (None, 14, 14, 128)  512         conv4_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_1_relu (Activation (None, 14, 14, 128)  0           conv4_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_2_conv (Conv2D)    (None, 14, 14, 32)   36864       conv4_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block4_concat (Concatenat (None, 14, 14, 384)  0           conv4_block3_concat[0][0]        
                                                                 conv4_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_0_bn (BatchNormali (None, 14, 14, 384)  1536        conv4_block4_concat[0][0]        
__________________________________________________________________________________________________
conv4_block5_0_relu (Activation (None, 14, 14, 384)  0           conv4_block5_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_1_conv (Conv2D)    (None, 14, 14, 128)  49152       conv4_block5_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block5_1_bn (BatchNormali (None, 14, 14, 128)  512         conv4_block5_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_1_relu (Activation (None, 14, 14, 128)  0           conv4_block5_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_2_conv (Conv2D)    (None, 14, 14, 32)   36864       conv4_block5_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block5_concat (Concatenat (None, 14, 14, 416)  0           conv4_block4_concat[0][0]        
                                                                 conv4_block5_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_0_bn (BatchNormali (None, 14, 14, 416)  1664        conv4_block5_concat[0][0]        
__________________________________________________________________________________________________
conv4_block6_0_relu (Activation (None, 14, 14, 416)  0           conv4_block6_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_1_conv (Conv2D)    (None, 14, 14, 128)  53248       conv4_block6_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block6_1_bn (BatchNormali (None, 14, 14, 128)  512         conv4_block6_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_1_relu (Activation (None, 14, 14, 128)  0           conv4_block6_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_2_conv (Conv2D)    (None, 14, 14, 32)   36864       conv4_block6_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block6_concat (Concatenat (None, 14, 14, 448)  0           conv4_block5_concat[0][0]        
                                                                 conv4_block6_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block7_0_bn (BatchNormali (None, 14, 14, 448)  1792        conv4_block6_concat[0][0]        
__________________________________________________________________________________________________
conv4_block7_0_relu (Activation (None, 14, 14, 448)  0           conv4_block7_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block7_1_conv (Conv2D)    (None, 14, 14, 128)  57344       conv4_block7_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block7_1_bn (BatchNormali (None, 14, 14, 128)  512         conv4_block7_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block7_1_relu (Activation (None, 14, 14, 128)  0           conv4_block7_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block7_2_conv (Conv2D)    (None, 14, 14, 32)   36864       conv4_block7_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block7_concat (Concatenat (None, 14, 14, 480)  0           conv4_block6_concat[0][0]        
                                                                 conv4_block7_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block8_0_bn (BatchNormali (None, 14, 14, 480)  1920        conv4_block7_concat[0][0]        
__________________________________________________________________________________________________
conv4_block8_0_relu (Activation (None, 14, 14, 480)  0           conv4_block8_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block8_1_conv (Conv2D)    (None, 14, 14, 128)  61440       conv4_block8_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block8_1_bn (BatchNormali (None, 14, 14, 128)  512         conv4_block8_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block8_1_relu (Activation (None, 14, 14, 128)  0           conv4_block8_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block8_2_conv (Conv2D)    (None, 14, 14, 32)   36864       conv4_block8_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block8_concat (Concatenat (None, 14, 14, 512)  0           conv4_block7_concat[0][0]        
                                                                 conv4_block8_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block9_0_bn (BatchNormali (None, 14, 14, 512)  2048        conv4_block8_concat[0][0]        
__________________________________________________________________________________________________
conv4_block9_0_relu (Activation (None, 14, 14, 512)  0           conv4_block9_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block9_1_conv (Conv2D)    (None, 14, 14, 128)  65536       conv4_block9_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block9_1_bn (BatchNormali (None, 14, 14, 128)  512         conv4_block9_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block9_1_relu (Activation (None, 14, 14, 128)  0           conv4_block9_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block9_2_conv (Conv2D)    (None, 14, 14, 32)   36864       conv4_block9_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block9_concat (Concatenat (None, 14, 14, 544)  0           conv4_block8_concat[0][0]        
                                                                 conv4_block9_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block10_0_bn (BatchNormal (None, 14, 14, 544)  2176        conv4_block9_concat[0][0]        
__________________________________________________________________________________________________
conv4_block10_0_relu (Activatio (None, 14, 14, 544)  0           conv4_block10_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block10_1_conv (Conv2D)   (None, 14, 14, 128)  69632       conv4_block10_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block10_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block10_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block10_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block10_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block10_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block10_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block10_concat (Concatena (None, 14, 14, 576)  0           conv4_block9_concat[0][0]        
                                                                 conv4_block10_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block11_0_bn (BatchNormal (None, 14, 14, 576)  2304        conv4_block10_concat[0][0]       
__________________________________________________________________________________________________
conv4_block11_0_relu (Activatio (None, 14, 14, 576)  0           conv4_block11_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block11_1_conv (Conv2D)   (None, 14, 14, 128)  73728       conv4_block11_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block11_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block11_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block11_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block11_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block11_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block11_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block11_concat (Concatena (None, 14, 14, 608)  0           conv4_block10_concat[0][0]       
                                                                 conv4_block11_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block12_0_bn (BatchNormal (None, 14, 14, 608)  2432        conv4_block11_concat[0][0]       
__________________________________________________________________________________________________
conv4_block12_0_relu (Activatio (None, 14, 14, 608)  0           conv4_block12_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block12_1_conv (Conv2D)   (None, 14, 14, 128)  77824       conv4_block12_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block12_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block12_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block12_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block12_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block12_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block12_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block12_concat (Concatena (None, 14, 14, 640)  0           conv4_block11_concat[0][0]       
                                                                 conv4_block12_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block13_0_bn (BatchNormal (None, 14, 14, 640)  2560        conv4_block12_concat[0][0]       
__________________________________________________________________________________________________
conv4_block13_0_relu (Activatio (None, 14, 14, 640)  0           conv4_block13_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block13_1_conv (Conv2D)   (None, 14, 14, 128)  81920       conv4_block13_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block13_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block13_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block13_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block13_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block13_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block13_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block13_concat (Concatena (None, 14, 14, 672)  0           conv4_block12_concat[0][0]       
                                                                 conv4_block13_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block14_0_bn (BatchNormal (None, 14, 14, 672)  2688        conv4_block13_concat[0][0]       
__________________________________________________________________________________________________
conv4_block14_0_relu (Activatio (None, 14, 14, 672)  0           conv4_block14_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block14_1_conv (Conv2D)   (None, 14, 14, 128)  86016       conv4_block14_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block14_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block14_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block14_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block14_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block14_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block14_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block14_concat (Concatena (None, 14, 14, 704)  0           conv4_block13_concat[0][0]       
                                                                 conv4_block14_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block15_0_bn (BatchNormal (None, 14, 14, 704)  2816        conv4_block14_concat[0][0]       
__________________________________________________________________________________________________
conv4_block15_0_relu (Activatio (None, 14, 14, 704)  0           conv4_block15_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block15_1_conv (Conv2D)   (None, 14, 14, 128)  90112       conv4_block15_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block15_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block15_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block15_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block15_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block15_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block15_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block15_concat (Concatena (None, 14, 14, 736)  0           conv4_block14_concat[0][0]       
                                                                 conv4_block15_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block16_0_bn (BatchNormal (None, 14, 14, 736)  2944        conv4_block15_concat[0][0]       
__________________________________________________________________________________________________
conv4_block16_0_relu (Activatio (None, 14, 14, 736)  0           conv4_block16_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block16_1_conv (Conv2D)   (None, 14, 14, 128)  94208       conv4_block16_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block16_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block16_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block16_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block16_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block16_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block16_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block16_concat (Concatena (None, 14, 14, 768)  0           conv4_block15_concat[0][0]       
                                                                 conv4_block16_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block17_0_bn (BatchNormal (None, 14, 14, 768)  3072        conv4_block16_concat[0][0]       
__________________________________________________________________________________________________
conv4_block17_0_relu (Activatio (None, 14, 14, 768)  0           conv4_block17_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block17_1_conv (Conv2D)   (None, 14, 14, 128)  98304       conv4_block17_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block17_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block17_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block17_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block17_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block17_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block17_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block17_concat (Concatena (None, 14, 14, 800)  0           conv4_block16_concat[0][0]       
                                                                 conv4_block17_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block18_0_bn (BatchNormal (None, 14, 14, 800)  3200        conv4_block17_concat[0][0]       
__________________________________________________________________________________________________
conv4_block18_0_relu (Activatio (None, 14, 14, 800)  0           conv4_block18_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block18_1_conv (Conv2D)   (None, 14, 14, 128)  102400      conv4_block18_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block18_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block18_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block18_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block18_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block18_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block18_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block18_concat (Concatena (None, 14, 14, 832)  0           conv4_block17_concat[0][0]       
                                                                 conv4_block18_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block19_0_bn (BatchNormal (None, 14, 14, 832)  3328        conv4_block18_concat[0][0]       
__________________________________________________________________________________________________
conv4_block19_0_relu (Activatio (None, 14, 14, 832)  0           conv4_block19_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block19_1_conv (Conv2D)   (None, 14, 14, 128)  106496      conv4_block19_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block19_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block19_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block19_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block19_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block19_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block19_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block19_concat (Concatena (None, 14, 14, 864)  0           conv4_block18_concat[0][0]       
                                                                 conv4_block19_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block20_0_bn (BatchNormal (None, 14, 14, 864)  3456        conv4_block19_concat[0][0]       
__________________________________________________________________________________________________
conv4_block20_0_relu (Activatio (None, 14, 14, 864)  0           conv4_block20_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block20_1_conv (Conv2D)   (None, 14, 14, 128)  110592      conv4_block20_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block20_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block20_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block20_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block20_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block20_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block20_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block20_concat (Concatena (None, 14, 14, 896)  0           conv4_block19_concat[0][0]       
                                                                 conv4_block20_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block21_0_bn (BatchNormal (None, 14, 14, 896)  3584        conv4_block20_concat[0][0]       
__________________________________________________________________________________________________
conv4_block21_0_relu (Activatio (None, 14, 14, 896)  0           conv4_block21_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block21_1_conv (Conv2D)   (None, 14, 14, 128)  114688      conv4_block21_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block21_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block21_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block21_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block21_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block21_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block21_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block21_concat (Concatena (None, 14, 14, 928)  0           conv4_block20_concat[0][0]       
                                                                 conv4_block21_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block22_0_bn (BatchNormal (None, 14, 14, 928)  3712        conv4_block21_concat[0][0]       
__________________________________________________________________________________________________
conv4_block22_0_relu (Activatio (None, 14, 14, 928)  0           conv4_block22_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block22_1_conv (Conv2D)   (None, 14, 14, 128)  118784      conv4_block22_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block22_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block22_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block22_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block22_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block22_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block22_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block22_concat (Concatena (None, 14, 14, 960)  0           conv4_block21_concat[0][0]       
                                                                 conv4_block22_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block23_0_bn (BatchNormal (None, 14, 14, 960)  3840        conv4_block22_concat[0][0]       
__________________________________________________________________________________________________
conv4_block23_0_relu (Activatio (None, 14, 14, 960)  0           conv4_block23_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block23_1_conv (Conv2D)   (None, 14, 14, 128)  122880      conv4_block23_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block23_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block23_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block23_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block23_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block23_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block23_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block23_concat (Concatena (None, 14, 14, 992)  0           conv4_block22_concat[0][0]       
                                                                 conv4_block23_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block24_0_bn (BatchNormal (None, 14, 14, 992)  3968        conv4_block23_concat[0][0]       
__________________________________________________________________________________________________
conv4_block24_0_relu (Activatio (None, 14, 14, 992)  0           conv4_block24_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block24_1_conv (Conv2D)   (None, 14, 14, 128)  126976      conv4_block24_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block24_1_bn (BatchNormal (None, 14, 14, 128)  512         conv4_block24_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block24_1_relu (Activatio (None, 14, 14, 128)  0           conv4_block24_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block24_2_conv (Conv2D)   (None, 14, 14, 32)   36864       conv4_block24_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block24_concat (Concatena (None, 14, 14, 1024) 0           conv4_block23_concat[0][0]       
                                                                 conv4_block24_2_conv[0][0]       
__________________________________________________________________________________________________
pool4_bn (BatchNormalization)   (None, 14, 14, 1024) 4096        conv4_block24_concat[0][0]       
__________________________________________________________________________________________________
pool4_relu (Activation)         (None, 14, 14, 1024) 0           pool4_bn[0][0]                   
__________________________________________________________________________________________________
pool4_conv (Conv2D)             (None, 14, 14, 512)  524288      pool4_relu[0][0]                 
__________________________________________________________________________________________________
pool4_pool (AveragePooling2D)   (None, 7, 7, 512)    0           pool4_conv[0][0]                 
__________________________________________________________________________________________________
conv5_block1_0_bn (BatchNormali (None, 7, 7, 512)    2048        pool4_pool[0][0]                 
__________________________________________________________________________________________________
conv5_block1_0_relu (Activation (None, 7, 7, 512)    0           conv5_block1_0_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_1_conv (Conv2D)    (None, 7, 7, 128)    65536       conv5_block1_0_relu[0][0]        
__________________________________________________________________________________________________
conv5_block1_1_bn (BatchNormali (None, 7, 7, 128)    512         conv5_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_1_relu (Activation (None, 7, 7, 128)    0           conv5_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_2_conv (Conv2D)    (None, 7, 7, 32)     36864       conv5_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block1_concat (Concatenat (None, 7, 7, 544)    0           pool4_pool[0][0]                 
                                                                 conv5_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_0_bn (BatchNormali (None, 7, 7, 544)    2176        conv5_block1_concat[0][0]        
__________________________________________________________________________________________________
conv5_block2_0_relu (Activation (None, 7, 7, 544)    0           conv5_block2_0_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_1_conv (Conv2D)    (None, 7, 7, 128)    69632       conv5_block2_0_relu[0][0]        
__________________________________________________________________________________________________
conv5_block2_1_bn (BatchNormali (None, 7, 7, 128)    512         conv5_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_1_relu (Activation (None, 7, 7, 128)    0           conv5_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_2_conv (Conv2D)    (None, 7, 7, 32)     36864       conv5_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block2_concat (Concatenat (None, 7, 7, 576)    0           conv5_block1_concat[0][0]        
                                                                 conv5_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_0_bn (BatchNormali (None, 7, 7, 576)    2304        conv5_block2_concat[0][0]        
__________________________________________________________________________________________________
conv5_block3_0_relu (Activation (None, 7, 7, 576)    0           conv5_block3_0_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_1_conv (Conv2D)    (None, 7, 7, 128)    73728       conv5_block3_0_relu[0][0]        
__________________________________________________________________________________________________
conv5_block3_1_bn (BatchNormali (None, 7, 7, 128)    512         conv5_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_1_relu (Activation (None, 7, 7, 128)    0           conv5_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_2_conv (Conv2D)    (None, 7, 7, 32)     36864       conv5_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block3_concat (Concatenat (None, 7, 7, 608)    0           conv5_block2_concat[0][0]        
                                                                 conv5_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block4_0_bn (BatchNormali (None, 7, 7, 608)    2432        conv5_block3_concat[0][0]        
__________________________________________________________________________________________________
conv5_block4_0_relu (Activation (None, 7, 7, 608)    0           conv5_block4_0_bn[0][0]          
__________________________________________________________________________________________________
conv5_block4_1_conv (Conv2D)    (None, 7, 7, 128)    77824       conv5_block4_0_relu[0][0]        
__________________________________________________________________________________________________
conv5_block4_1_bn (BatchNormali (None, 7, 7, 128)    512         conv5_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block4_1_relu (Activation (None, 7, 7, 128)    0           conv5_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block4_2_conv (Conv2D)    (None, 7, 7, 32)     36864       conv5_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block4_concat (Concatenat (None, 7, 7, 640)    0           conv5_block3_concat[0][0]        
                                                                 conv5_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block5_0_bn (BatchNormali (None, 7, 7, 640)    2560        conv5_block4_concat[0][0]        
__________________________________________________________________________________________________
conv5_block5_0_relu (Activation (None, 7, 7, 640)    0           conv5_block5_0_bn[0][0]          
__________________________________________________________________________________________________
conv5_block5_1_conv (Conv2D)    (None, 7, 7, 128)    81920       conv5_block5_0_relu[0][0]        
__________________________________________________________________________________________________
conv5_block5_1_bn (BatchNormali (None, 7, 7, 128)    512         conv5_block5_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block5_1_relu (Activation (None, 7, 7, 128)    0           conv5_block5_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block5_2_conv (Conv2D)    (None, 7, 7, 32)     36864       conv5_block5_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block5_concat (Concatenat (None, 7, 7, 672)    0           conv5_block4_concat[0][0]        
                                                                 conv5_block5_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block6_0_bn (BatchNormali (None, 7, 7, 672)    2688        conv5_block5_concat[0][0]        
__________________________________________________________________________________________________
conv5_block6_0_relu (Activation (None, 7, 7, 672)    0           conv5_block6_0_bn[0][0]          
__________________________________________________________________________________________________
conv5_block6_1_conv (Conv2D)    (None, 7, 7, 128)    86016       conv5_block6_0_relu[0][0]        
__________________________________________________________________________________________________
conv5_block6_1_bn (BatchNormali (None, 7, 7, 128)    512         conv5_block6_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block6_1_relu (Activation (None, 7, 7, 128)    0           conv5_block6_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block6_2_conv (Conv2D)    (None, 7, 7, 32)     36864       conv5_block6_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block6_concat (Concatenat (None, 7, 7, 704)    0           conv5_block5_concat[0][0]        
                                                                 conv5_block6_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block7_0_bn (BatchNormali (None, 7, 7, 704)    2816        conv5_block6_concat[0][0]        
__________________________________________________________________________________________________
conv5_block7_0_relu (Activation (None, 7, 7, 704)    0           conv5_block7_0_bn[0][0]          
__________________________________________________________________________________________________
conv5_block7_1_conv (Conv2D)    (None, 7, 7, 128)    90112       conv5_block7_0_relu[0][0]        
__________________________________________________________________________________________________
conv5_block7_1_bn (BatchNormali (None, 7, 7, 128)    512         conv5_block7_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block7_1_relu (Activation (None, 7, 7, 128)    0           conv5_block7_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block7_2_conv (Conv2D)    (None, 7, 7, 32)     36864       conv5_block7_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block7_concat (Concatenat (None, 7, 7, 736)    0           conv5_block6_concat[0][0]        
                                                                 conv5_block7_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block8_0_bn (BatchNormali (None, 7, 7, 736)    2944        conv5_block7_concat[0][0]        
__________________________________________________________________________________________________
conv5_block8_0_relu (Activation (None, 7, 7, 736)    0           conv5_block8_0_bn[0][0]          
__________________________________________________________________________________________________
conv5_block8_1_conv (Conv2D)    (None, 7, 7, 128)    94208       conv5_block8_0_relu[0][0]        
__________________________________________________________________________________________________
conv5_block8_1_bn (BatchNormali (None, 7, 7, 128)    512         conv5_block8_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block8_1_relu (Activation (None, 7, 7, 128)    0           conv5_block8_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block8_2_conv (Conv2D)    (None, 7, 7, 32)     36864       conv5_block8_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block8_concat (Concatenat (None, 7, 7, 768)    0           conv5_block7_concat[0][0]        
                                                                 conv5_block8_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block9_0_bn (BatchNormali (None, 7, 7, 768)    3072        conv5_block8_concat[0][0]        
__________________________________________________________________________________________________
conv5_block9_0_relu (Activation (None, 7, 7, 768)    0           conv5_block9_0_bn[0][0]          
__________________________________________________________________________________________________
conv5_block9_1_conv (Conv2D)    (None, 7, 7, 128)    98304       conv5_block9_0_relu[0][0]        
__________________________________________________________________________________________________
conv5_block9_1_bn (BatchNormali (None, 7, 7, 128)    512         conv5_block9_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block9_1_relu (Activation (None, 7, 7, 128)    0           conv5_block9_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block9_2_conv (Conv2D)    (None, 7, 7, 32)     36864       conv5_block9_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block9_concat (Concatenat (None, 7, 7, 800)    0           conv5_block8_concat[0][0]        
                                                                 conv5_block9_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block10_0_bn (BatchNormal (None, 7, 7, 800)    3200        conv5_block9_concat[0][0]        
__________________________________________________________________________________________________
conv5_block10_0_relu (Activatio (None, 7, 7, 800)    0           conv5_block10_0_bn[0][0]         
__________________________________________________________________________________________________
conv5_block10_1_conv (Conv2D)   (None, 7, 7, 128)    102400      conv5_block10_0_relu[0][0]       
__________________________________________________________________________________________________
conv5_block10_1_bn (BatchNormal (None, 7, 7, 128)    512         conv5_block10_1_conv[0][0]       
__________________________________________________________________________________________________
conv5_block10_1_relu (Activatio (None, 7, 7, 128)    0           conv5_block10_1_bn[0][0]         
__________________________________________________________________________________________________
conv5_block10_2_conv (Conv2D)   (None, 7, 7, 32)     36864       conv5_block10_1_relu[0][0]       
__________________________________________________________________________________________________
conv5_block10_concat (Concatena (None, 7, 7, 832)    0           conv5_block9_concat[0][0]        
                                                                 conv5_block10_2_conv[0][0]       
__________________________________________________________________________________________________
conv5_block11_0_bn (BatchNormal (None, 7, 7, 832)    3328        conv5_block10_concat[0][0]       
__________________________________________________________________________________________________
conv5_block11_0_relu (Activatio (None, 7, 7, 832)    0           conv5_block11_0_bn[0][0]         
__________________________________________________________________________________________________
conv5_block11_1_conv (Conv2D)   (None, 7, 7, 128)    106496      conv5_block11_0_relu[0][0]       
__________________________________________________________________________________________________
conv5_block11_1_bn (BatchNormal (None, 7, 7, 128)    512         conv5_block11_1_conv[0][0]       
__________________________________________________________________________________________________
conv5_block11_1_relu (Activatio (None, 7, 7, 128)    0           conv5_block11_1_bn[0][0]         
__________________________________________________________________________________________________
conv5_block11_2_conv (Conv2D)   (None, 7, 7, 32)     36864       conv5_block11_1_relu[0][0]       
__________________________________________________________________________________________________
conv5_block11_concat (Concatena (None, 7, 7, 864)    0           conv5_block10_concat[0][0]       
                                                                 conv5_block11_2_conv[0][0]       
__________________________________________________________________________________________________
conv5_block12_0_bn (BatchNormal (None, 7, 7, 864)    3456        conv5_block11_concat[0][0]       
__________________________________________________________________________________________________
conv5_block12_0_relu (Activatio (None, 7, 7, 864)    0           conv5_block12_0_bn[0][0]         
__________________________________________________________________________________________________
conv5_block12_1_conv (Conv2D)   (None, 7, 7, 128)    110592      conv5_block12_0_relu[0][0]       
__________________________________________________________________________________________________
conv5_block12_1_bn (BatchNormal (None, 7, 7, 128)    512         conv5_block12_1_conv[0][0]       
__________________________________________________________________________________________________
conv5_block12_1_relu (Activatio (None, 7, 7, 128)    0           conv5_block12_1_bn[0][0]         
__________________________________________________________________________________________________
conv5_block12_2_conv (Conv2D)   (None, 7, 7, 32)     36864       conv5_block12_1_relu[0][0]       
__________________________________________________________________________________________________
conv5_block12_concat (Concatena (None, 7, 7, 896)    0           conv5_block11_concat[0][0]       
                                                                 conv5_block12_2_conv[0][0]       
__________________________________________________________________________________________________
conv5_block13_0_bn (BatchNormal (None, 7, 7, 896)    3584        conv5_block12_concat[0][0]       
__________________________________________________________________________________________________
conv5_block13_0_relu (Activatio (None, 7, 7, 896)    0           conv5_block13_0_bn[0][0]         
__________________________________________________________________________________________________
conv5_block13_1_conv (Conv2D)   (None, 7, 7, 128)    114688      conv5_block13_0_relu[0][0]       
__________________________________________________________________________________________________
conv5_block13_1_bn (BatchNormal (None, 7, 7, 128)    512         conv5_block13_1_conv[0][0]       
__________________________________________________________________________________________________
conv5_block13_1_relu (Activatio (None, 7, 7, 128)    0           conv5_block13_1_bn[0][0]         
__________________________________________________________________________________________________
conv5_block13_2_conv (Conv2D)   (None, 7, 7, 32)     36864       conv5_block13_1_relu[0][0]       
__________________________________________________________________________________________________
conv5_block13_concat (Concatena (None, 7, 7, 928)    0           conv5_block12_concat[0][0]       
                                                                 conv5_block13_2_conv[0][0]       
__________________________________________________________________________________________________
conv5_block14_0_bn (BatchNormal (None, 7, 7, 928)    3712        conv5_block13_concat[0][0]       
__________________________________________________________________________________________________
conv5_block14_0_relu (Activatio (None, 7, 7, 928)    0           conv5_block14_0_bn[0][0]         
__________________________________________________________________________________________________
conv5_block14_1_conv (Conv2D)   (None, 7, 7, 128)    118784      conv5_block14_0_relu[0][0]       
__________________________________________________________________________________________________
conv5_block14_1_bn (BatchNormal (None, 7, 7, 128)    512         conv5_block14_1_conv[0][0]       
__________________________________________________________________________________________________
conv5_block14_1_relu (Activatio (None, 7, 7, 128)    0           conv5_block14_1_bn[0][0]         
__________________________________________________________________________________________________
conv5_block14_2_conv (Conv2D)   (None, 7, 7, 32)     36864       conv5_block14_1_relu[0][0]       
__________________________________________________________________________________________________
conv5_block14_concat (Concatena (None, 7, 7, 960)    0           conv5_block13_concat[0][0]       
                                                                 conv5_block14_2_conv[0][0]       
__________________________________________________________________________________________________
conv5_block15_0_bn (BatchNormal (None, 7, 7, 960)    3840        conv5_block14_concat[0][0]       
__________________________________________________________________________________________________
conv5_block15_0_relu (Activatio (None, 7, 7, 960)    0           conv5_block15_0_bn[0][0]         
__________________________________________________________________________________________________
conv5_block15_1_conv (Conv2D)   (None, 7, 7, 128)    122880      conv5_block15_0_relu[0][0]       
__________________________________________________________________________________________________
conv5_block15_1_bn (BatchNormal (None, 7, 7, 128)    512         conv5_block15_1_conv[0][0]       
__________________________________________________________________________________________________
conv5_block15_1_relu (Activatio (None, 7, 7, 128)    0           conv5_block15_1_bn[0][0]         
__________________________________________________________________________________________________
conv5_block15_2_conv (Conv2D)   (None, 7, 7, 32)     36864       conv5_block15_1_relu[0][0]       
__________________________________________________________________________________________________
conv5_block15_concat (Concatena (None, 7, 7, 992)    0           conv5_block14_concat[0][0]       
                                                                 conv5_block15_2_conv[0][0]       
__________________________________________________________________________________________________
conv5_block16_0_bn (BatchNormal (None, 7, 7, 992)    3968        conv5_block15_concat[0][0]       
__________________________________________________________________________________________________
conv5_block16_0_relu (Activatio (None, 7, 7, 992)    0           conv5_block16_0_bn[0][0]         
__________________________________________________________________________________________________
conv5_block16_1_conv (Conv2D)   (None, 7, 7, 128)    126976      conv5_block16_0_relu[0][0]       
__________________________________________________________________________________________________
conv5_block16_1_bn (BatchNormal (None, 7, 7, 128)    512         conv5_block16_1_conv[0][0]       
__________________________________________________________________________________________________
conv5_block16_1_relu (Activatio (None, 7, 7, 128)    0           conv5_block16_1_bn[0][0]         
__________________________________________________________________________________________________
conv5_block16_2_conv (Conv2D)   (None, 7, 7, 32)     36864       conv5_block16_1_relu[0][0]       
__________________________________________________________________________________________________
conv5_block16_concat (Concatena (None, 7, 7, 1024)   0           conv5_block15_concat[0][0]       
                                                                 conv5_block16_2_conv[0][0]       
__________________________________________________________________________________________________
bn (BatchNormalization)         (None, 7, 7, 1024)   4096        conv5_block16_concat[0][0]       
__________________________________________________________________________________________________
relu (Activation)               (None, 7, 7, 1024)   0           bn[0][0]                         
__________________________________________________________________________________________________
avg_pool (GlobalAveragePooling2 (None, 1024)         0           relu[0][0]                       
__________________________________________________________________________________________________
fc1000 (Dense)                  (None, 1000)         1025000     avg_pool[0][0]                   
==================================================================================================
Total params: 8,062,504
Trainable params: 7,978,856
Non-trainable params: 83,648
__________________________________________________________________________________________________
In [5]:
def load_pretrained_model(pretrained_model=DenseNet121, weights='imagenet', transfer_layer='conv5_block16_1_conv', 
                          transfer_layer_idx=420):
    
    """
    Return pre-trained model with defined weights and defined transfer layer.
    
    Args:
        pretrained model (Model): the pretrained model.
        weights (str): weights, which will be loaded.
        transfer_layer (str): name of the layer from which the weights will be learned during training.
        transfer_layer_idx (int): index of the previously specified transfer layer.
        
    Returns:
        pretrained model (Model): the pretrained model. 
    """
    
    model = pretrained_model(include_top=True, weights=weights)
    transfer_layer = model.get_layer(transfer_layer)
    pretrained_model = Model(inputs = model.input, outputs = model.output) 
    
    # freezing layers till transfer_layer
    for layer in pretrained_model.layers[0:transfer_layer_idx]:
        layer.trainable = False
    
    # rechecking, if the freezeing was performed correctly
    print('Pre-trained model layers and their trainability')
    for layer in pretrained_model.layers:
        print(layer.name, layer.trainable)
    
    return pretrained_model


def build_model(pretrained_model, lr=0.0001, loss = 'binary_crossentropy', metrics = ['binary_accuracy'], dropout=0.5):
    
    """
    Function builing a model by attaching layers after the pre-trained model.
    
    Args:
        pretrained model (Model): the pretrained model.
        lr (Float): learning rate.
        loss (str): loss function.
        metrics (List): metrics to be monitored during training.
        dropout (Float): fraction of neurons randomly swiching off during training.
    
    Returns:
    model (Model): model ready for training 
    """
    
    model = Sequential()
    model.add(pretrained_model)
    model.add(Dense(500, activation='relu'))
    model.add(Dropout(dropout))
    model.add(Dense(100, activation='relu'))
    model.add(Dropout(dropout))
    model.add(Dense(50, activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    
    optimizer = Adam(lr=lr)
    loss = loss
    metrics = metrics 
    
    model.compile(optimizer=optimizer, loss=loss, metrics=metrics)
        
    return model


def build_simpler_model(pretrained_model, lr=0.0001, loss = 'binary_crossentropy', metrics = ['binary_accuracy']):
    
    """
    Function builing a model by attaching single output neuron after the pre-trained model.
    
    Args:
        pretrained model (Model): the pretrained model.
        lr (Float): learning rate.
        loss (str): loss function.
        metrics (List): metrics to be monitored during training.
        dropout (Float): fraction of neurons randomly swiching off during training.
    
    Returns:
    model (Model): model ready for training 
    """
    
    model = Sequential()
    model.add(pretrained_model)
    model.add(Dense(1, activation='sigmoid'))
    
    optimizer = Adam(lr=lr)
    loss = loss
    metrics = metrics 
    
    model.compile(optimizer=optimizer, loss=loss, metrics=metrics)
        
    return model



def build_model_no_dropout(pretrained_model, lr=0.0001, loss = 'binary_crossentropy', metrics = ['binary_accuracy']):
    
    """
    Function builing a model by attaching layers after the pre-trained model without dropout layers.
    
    Args:
        pretrained model (Model): the pretrained model.
        lr (Float): learning rate.
        loss (str): loss function.
        metrics (List): metrics to be monitored during training.
    
    Returns:
    model (Model): model ready for training 
    """
    
    model = Sequential()
    model.add(pretrained_model)
    model.add(Dense(500, activation='relu'))
    model.add(Dense(100, activation='relu'))
    model.add(Dense(50, activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    
    optimizer = Adam(lr=lr)
    loss = loss
    metrics = metrics 
    
    model.compile(optimizer=optimizer, loss=loss, metrics=metrics)
        
    return model

def save_model(model, model_name):
    
    """
    Function saving a model to .json file.
    
    Args:
        model (Model): model to be saved.
        model_name (str): name under which the model will be saved.
    """
    
    model_json = model.to_json()
    filename = str(model_name) + '.json'
    with open(filename, 'w') as json_file:
              json_file.write(model_json)
    print(f'Saved model architecture under {filename}')

Fuctions for model training and evaluation

In [15]:
def plot_history(history_df):   
    """
    Function plotting the training history.
    
    Args:
        history_df (dataframe): dataframe containing training history
    """
    
    N = len(history_df)
    fig, ax = plt.subplots(1,2, figsize=(15,6))
    
    ax[0].plot(np.arange(0, N), history_df["loss"], label="train loss", color='blue')
    ax[0].plot(np.arange(0, N), history_df["val_loss"], label="valid loss", color='red')
    ax[0].set_title("Training and validation loss")
    ax[0].set_xlabel("# of epoch")
    ax[0].set_ylabel("Loss")
    ax[0].legend()
    
    ax[1].plot(np.arange(0, N), history_df["binary_accuracy"], label="train acc", color='blue')
    ax[1].plot(np.arange(0, N), history_df["val_binary_accuracy"], label="valid acc", color='red')
    ax[1].set_title("Training and validation accuracy")
    ax[1].set_xlabel("# of epoch")
    ax[1].set_ylabel("Accuracy")
    ax[1].legend()
    
    fig.tight_layout(pad=3.0)
    plt.show()


def train_model(model, model_name, train_gen, valid_gen, epochs=50, 
                                   monitor='val_loss', mode='min', patience=15):
    """
    Function training the model with ModelCheckpoint and EarlyStopping, 
    saving training history in form of dataframe to .pkl file and plotting the history.
    
    Args:
        model (Model): model to be saved.
        model_name (str): name under which the training history will be saved.
        train_gen (DataFrameIterator): iterator over training set.
        valid_gen (DataFrameIterator): iterator over validation set.
        epochs (int): Number of epochs to train the model. 
        monitor (str): metrics to be monitored.
        mode (one of {'auto', 'min', 'max'}): the decision to overwrite the current save file 
                                            is made based on either the maximization or the minimization 
                                            of the monitored quantity.
        patience (int): number of training epochs without model improvement before quitting.
        
    Returns:
    history (model.history): history of model training
    """
    
    
    # set callbacks
    checkpoint = ModelCheckpoint(str(model_name + '.best.hdf5'), 
                             monitor = monitor, 
                             verbose = 1, 
                             save_best_only = True, 
                             mode = mode, 
                             save_weights_only = True)
    
    early = EarlyStopping(monitor = monitor, 
                      mode = mode, 
                      patience=patience)
    
    callbacks_list = [checkpoint, early]

            
    # get fit_generator
    history = model.fit_generator(train_gen,validation_data = valid_gen,
                                            epochs = epochs,
                                            callbacks = callbacks_list)
    
    # create dataframe containing training history and save it to .pkl
    history_df = pd.DataFrame(history.history)
    filename = str(model_name + '_history.pkl')
    history_df.to_pickle(filename)
    
    
    # plot history
    plot_history(history_df)
    
    return history
In [7]:
def plot_prediction_distribution(pred_array):
    """
    Function plotting the distribution of predictions.
    
    Args:
        pred_array (numpy array): array containing predictions
    """
    plt.figure(figsize=(6,4))
    plt.hist(pred_array, bins=50)
    plt.title('Distribution of model predictions')
    plt.xlabel('Prediction: probability of pneumonia')
    plt.ylabel('# number of scans')
    plt.show()
    print(f'Model prediction min: {pred_array.min():.3}')
    print(f'Model prediction max: {pred_array.max():.3}')
    

def plot_AUROC(GT_array, pred_array):
    """
    Function plotting ROC curve and displaying AUC.
    
    Args:
        GT_array (numpy array): arry containing ground truth.
        pred_array (numpy array): arry containing predictions.
        
    Returns:
        fpr (numpy array): array containing false postive rates for different treshold values.
        tpr (numpy array): array containing true postive rates for different treshold values.
        tresholds (numpy array): array containing treshold values used for generation of the ROC curve.
    """    
    
    plt.figure(figsize=(6,4))
    fpr, tpr, thresholds = roc_curve(GT_array, pred_array)
    plt.plot(fpr, tpr, label = '%s (AUC: %0.2f)'  % ('Pneumonia', auc(fpr, tpr)))
    plt.xlabel('False Positive Rate')
    plt.ylabel('True Positive Rate')
    plt.plot([0, 1], [0, 1], linestyle='--', lw=2, color='black',label='Random choice')
    plt.legend()
    plt.show()
    
    return fpr, tpr, thresholds
    

def plot_precision_recall_curve(GT_array, pred_array):
    """
    Function plotting precision-recall curve and displaying average precision score.
    
    Args:
        GT_array (numpy array): arry containing ground truth.
        pred_array (numpy array): arry containing predictions.
        
    Returns:
        precision (numpy array): array containing precision values for different treshold values.
        recall (numpy array): array containing recall for different treshold values.
        tresholds (numpy array): array containing treshold values used for generation of the precision-recall curve.
    """
    plt.figure(figsize=(6,4))
    precision, recall, thresholds = precision_recall_curve(GT_array, pred_array)
    plt.plot(recall, precision, label = '%s (AP Score:%0.2f)'  % ('Pneumonia', average_precision_score(GT_array,pred_array)))
    plt.xlabel('Recall')
    plt.ylabel('Precision')
    plt.legend()
    plt.show()
    
    return precision, recall, thresholds


def  calc_f1(prec,recall):
    """
    Function calculating F1 score.
    
    Args:
        prec (float): precision value.
        recall (float): recall value.
       
    Returns:
        2*(prec*recall)/(prec+recall) (float): F1-score
    """
    
    return 2*(prec*recall)/(prec+recall)


def plot_f1_tresh(GT_array, pred_array):
    """
    Function plotting f1 score vs treshold.
    
    Args:
        GT_array (numpy array): arry containing ground truth.
        pred_array (numpy array): arry containing predictions.
        
    Returns:
        f1 (numpy array): array containing f1 score for different treshold values.
        tresholds (numpy array): array containing treshold values used for generation of the precision-recall curve.
    """
    
    precision, recall, thresholds = precision_recall_curve(GT_array, pred_array)
    f1 = calc_f1(precision, recall)
    plt.figure(figsize=(6,4))
    plt.plot(f1[:-1], thresholds)
    plt.xlabel('F1 score')
    plt.ylabel('Treshold')
    plt.show()
    
    return f1, thresholds


def make_evaluation_df(GT_array, pred_array, treshold):
    """
    Function returing dataframe containing ground truth, model prediction and class based on chosen treshold.
    
    Args:
        GT_array (numpy array): arry containing ground truth.
        pred_array (numpy array): arry containing predictions.
        treshold(float): value of chosen treshold
        
    Returns:
        evaluation_df (dataframe): dataframe containing ground truth, model prediction and class based on chosen treshold.
    """    
    
    evaluation_df = pd.DataFrame(GT_array, columns=['Ground_truth'])
    evaluation_df['Pred'] = pred_array
    evaluation_df['Pred_tresh'] = evaluation_df['Pred'] >= treshold
    evaluation_df['Pred_tresh'] = evaluation_df['Pred_tresh'].replace(True, 1).replace(False, 0)
    display(evaluation_df.head())
    
    return evaluation_df
In [8]:
def predict_and_evaluate_model(model, model_name, test_gen, steps):
    """
    Function predicting labels, plotting prediction distribution,
    AUROC curve and precision recall curve.
    Function also saves model predictions and ground truth in as .npy files.
    
    Args:
        model (Model): model, which will be used for generation of predictions.
        model_name (str): prefix under which the prediction array and the ground truth array will be saved.
        test_gen (DataFrameIterator): iterator over test set.
        steps (int):  Total number of steps (batches of samples) before declaring the prediction round finished.
       
    Returns:
        GT_array (numpy array): arry containing ground truth.
        pred_array (numpy array): arry containing predictions.
        evaluation_dic (dict): dictonary containing values from AUROC and precision-recall curves.
    """    
    
    # generate predictions
    pred_Y = model.predict_generator(test_gen, steps=steps, verbose = True)
    
    # get ground truth labels
    ground_truth = test_gen.labels
    
    # save prediction and ground truth as .npy files.
    np.save(model_name + '_pred_Y.npy', pred_Y)
    np.save(model_name + '_GT.numpy', test_gen.labels)
    
    # plot prediction distribution, AUROC and precision-recall curve
    plot_prediction_distribution(pred_Y)
    fpr, tpr, thresholds_auroc = plot_AUROC(ground_truth, pred_Y)
    precision, recall, thresholds_pr = plot_precision_recall_curve(ground_truth, pred_Y)
    
    # create distonary containing values from AUROC and precision-recall curves
    evaluation_dic = {'fpr': fpr, 
                      'tpr' : tpr,
                      'thresholds_auroc': thresholds_auroc,
                      'precision': precision,
                      'recall': recall,
                      'thresholds_pr': thresholds_pr}
    
    return pred_Y, ground_truth, evaluation_dic

Initialization of training, validation and test generators

In [9]:
# Initializing training generator
train_idg = image_generator_train()
train_gen = get_generator_train(train_idg, train_df)


# Looking at some examples of our training data to understand the extent to which data is being augumented prior to training
train_x, train_y = next(train_gen)
fig, m_axs = plt.subplots(4, 4, figsize = (16, 16))

for (image, label, plot) in zip(train_x, train_y, m_axs.flatten()):
    plot.imshow(image[:,:,0], cmap = 'bone')
    if label == 1: 
        plot.set_title('Pneumonia')
    else:
        plot.set_title('No Pneumonia')
    plot.axis('off')
Found 2140 validated image filenames belonging to 2 classes.
In [30]:
# Accessing the order of the classes in training generator
train_gen.class_indices
Out[30]:
{'Negative': 0, 'Positive': 1}
In [10]:
# Initializing validation and test generators
valid_gen, test_gen = get_test_valid_generator()
Found 2140 validated image filenames belonging to 2 classes.
Mean of 100 images from the training set: 123.9
Std of 100 images from the training set: 63.04
Found 174 validated image filenames belonging to 2 classes.
Found 1370 validated image filenames belonging to 2 classes.

Training models based on DenseNet121:

Model 1

In [4]:
# Model 1: 
## transfer layer: conv5_block16_1_conv (idx 420)
## learning rate E-4
## dropout 0.2
model_1_pretrained = load_pretrained_model(pretrained_model=DenseNet121)
model_1 = build_model(model_1_pretrained, dropout=0.2)
Downloading data from https://github.com/keras-team/keras-applications/releases/download/densenet/densenet121_weights_tf_dim_ordering_tf_kernels.h5
33193984/33188688 [==============================] - 1s 0us/step
Pre-trained model layers and their trainability
input_1 False
zero_padding2d_1 False
conv1/conv False
conv1/bn False
conv1/relu False
zero_padding2d_2 False
pool1 False
conv2_block1_0_bn False
conv2_block1_0_relu False
conv2_block1_1_conv False
conv2_block1_1_bn False
conv2_block1_1_relu False
conv2_block1_2_conv False
conv2_block1_concat False
conv2_block2_0_bn False
conv2_block2_0_relu False
conv2_block2_1_conv False
conv2_block2_1_bn False
conv2_block2_1_relu False
conv2_block2_2_conv False
conv2_block2_concat False
conv2_block3_0_bn False
conv2_block3_0_relu False
conv2_block3_1_conv False
conv2_block3_1_bn False
conv2_block3_1_relu False
conv2_block3_2_conv False
conv2_block3_concat False
conv2_block4_0_bn False
conv2_block4_0_relu False
conv2_block4_1_conv False
conv2_block4_1_bn False
conv2_block4_1_relu False
conv2_block4_2_conv False
conv2_block4_concat False
conv2_block5_0_bn False
conv2_block5_0_relu False
conv2_block5_1_conv False
conv2_block5_1_bn False
conv2_block5_1_relu False
conv2_block5_2_conv False
conv2_block5_concat False
conv2_block6_0_bn False
conv2_block6_0_relu False
conv2_block6_1_conv False
conv2_block6_1_bn False
conv2_block6_1_relu False
conv2_block6_2_conv False
conv2_block6_concat False
pool2_bn False
pool2_relu False
pool2_conv False
pool2_pool False
conv3_block1_0_bn False
conv3_block1_0_relu False
conv3_block1_1_conv False
conv3_block1_1_bn False
conv3_block1_1_relu False
conv3_block1_2_conv False
conv3_block1_concat False
conv3_block2_0_bn False
conv3_block2_0_relu False
conv3_block2_1_conv False
conv3_block2_1_bn False
conv3_block2_1_relu False
conv3_block2_2_conv False
conv3_block2_concat False
conv3_block3_0_bn False
conv3_block3_0_relu False
conv3_block3_1_conv False
conv3_block3_1_bn False
conv3_block3_1_relu False
conv3_block3_2_conv False
conv3_block3_concat False
conv3_block4_0_bn False
conv3_block4_0_relu False
conv3_block4_1_conv False
conv3_block4_1_bn False
conv3_block4_1_relu False
conv3_block4_2_conv False
conv3_block4_concat False
conv3_block5_0_bn False
conv3_block5_0_relu False
conv3_block5_1_conv False
conv3_block5_1_bn False
conv3_block5_1_relu False
conv3_block5_2_conv False
conv3_block5_concat False
conv3_block6_0_bn False
conv3_block6_0_relu False
conv3_block6_1_conv False
conv3_block6_1_bn False
conv3_block6_1_relu False
conv3_block6_2_conv False
conv3_block6_concat False
conv3_block7_0_bn False
conv3_block7_0_relu False
conv3_block7_1_conv False
conv3_block7_1_bn False
conv3_block7_1_relu False
conv3_block7_2_conv False
conv3_block7_concat False
conv3_block8_0_bn False
conv3_block8_0_relu False
conv3_block8_1_conv False
conv3_block8_1_bn False
conv3_block8_1_relu False
conv3_block8_2_conv False
conv3_block8_concat False
conv3_block9_0_bn False
conv3_block9_0_relu False
conv3_block9_1_conv False
conv3_block9_1_bn False
conv3_block9_1_relu False
conv3_block9_2_conv False
conv3_block9_concat False
conv3_block10_0_bn False
conv3_block10_0_relu False
conv3_block10_1_conv False
conv3_block10_1_bn False
conv3_block10_1_relu False
conv3_block10_2_conv False
conv3_block10_concat False
conv3_block11_0_bn False
conv3_block11_0_relu False
conv3_block11_1_conv False
conv3_block11_1_bn False
conv3_block11_1_relu False
conv3_block11_2_conv False
conv3_block11_concat False
conv3_block12_0_bn False
conv3_block12_0_relu False
conv3_block12_1_conv False
conv3_block12_1_bn False
conv3_block12_1_relu False
conv3_block12_2_conv False
conv3_block12_concat False
pool3_bn False
pool3_relu False
pool3_conv False
pool3_pool False
conv4_block1_0_bn False
conv4_block1_0_relu False
conv4_block1_1_conv False
conv4_block1_1_bn False
conv4_block1_1_relu False
conv4_block1_2_conv False
conv4_block1_concat False
conv4_block2_0_bn False
conv4_block2_0_relu False
conv4_block2_1_conv False
conv4_block2_1_bn False
conv4_block2_1_relu False
conv4_block2_2_conv False
conv4_block2_concat False
conv4_block3_0_bn False
conv4_block3_0_relu False
conv4_block3_1_conv False
conv4_block3_1_bn False
conv4_block3_1_relu False
conv4_block3_2_conv False
conv4_block3_concat False
conv4_block4_0_bn False
conv4_block4_0_relu False
conv4_block4_1_conv False
conv4_block4_1_bn False
conv4_block4_1_relu False
conv4_block4_2_conv False
conv4_block4_concat False
conv4_block5_0_bn False
conv4_block5_0_relu False
conv4_block5_1_conv False
conv4_block5_1_bn False
conv4_block5_1_relu False
conv4_block5_2_conv False
conv4_block5_concat False
conv4_block6_0_bn False
conv4_block6_0_relu False
conv4_block6_1_conv False
conv4_block6_1_bn False
conv4_block6_1_relu False
conv4_block6_2_conv False
conv4_block6_concat False
conv4_block7_0_bn False
conv4_block7_0_relu False
conv4_block7_1_conv False
conv4_block7_1_bn False
conv4_block7_1_relu False
conv4_block7_2_conv False
conv4_block7_concat False
conv4_block8_0_bn False
conv4_block8_0_relu False
conv4_block8_1_conv False
conv4_block8_1_bn False
conv4_block8_1_relu False
conv4_block8_2_conv False
conv4_block8_concat False
conv4_block9_0_bn False
conv4_block9_0_relu False
conv4_block9_1_conv False
conv4_block9_1_bn False
conv4_block9_1_relu False
conv4_block9_2_conv False
conv4_block9_concat False
conv4_block10_0_bn False
conv4_block10_0_relu False
conv4_block10_1_conv False
conv4_block10_1_bn False
conv4_block10_1_relu False
conv4_block10_2_conv False
conv4_block10_concat False
conv4_block11_0_bn False
conv4_block11_0_relu False
conv4_block11_1_conv False
conv4_block11_1_bn False
conv4_block11_1_relu False
conv4_block11_2_conv False
conv4_block11_concat False
conv4_block12_0_bn False
conv4_block12_0_relu False
conv4_block12_1_conv False
conv4_block12_1_bn False
conv4_block12_1_relu False
conv4_block12_2_conv False
conv4_block12_concat False
conv4_block13_0_bn False
conv4_block13_0_relu False
conv4_block13_1_conv False
conv4_block13_1_bn False
conv4_block13_1_relu False
conv4_block13_2_conv False
conv4_block13_concat False
conv4_block14_0_bn False
conv4_block14_0_relu False
conv4_block14_1_conv False
conv4_block14_1_bn False
conv4_block14_1_relu False
conv4_block14_2_conv False
conv4_block14_concat False
conv4_block15_0_bn False
conv4_block15_0_relu False
conv4_block15_1_conv False
conv4_block15_1_bn False
conv4_block15_1_relu False
conv4_block15_2_conv False
conv4_block15_concat False
conv4_block16_0_bn False
conv4_block16_0_relu False
conv4_block16_1_conv False
conv4_block16_1_bn False
conv4_block16_1_relu False
conv4_block16_2_conv False
conv4_block16_concat False
conv4_block17_0_bn False
conv4_block17_0_relu False
conv4_block17_1_conv False
conv4_block17_1_bn False
conv4_block17_1_relu False
conv4_block17_2_conv False
conv4_block17_concat False
conv4_block18_0_bn False
conv4_block18_0_relu False
conv4_block18_1_conv False
conv4_block18_1_bn False
conv4_block18_1_relu False
conv4_block18_2_conv False
conv4_block18_concat False
conv4_block19_0_bn False
conv4_block19_0_relu False
conv4_block19_1_conv False
conv4_block19_1_bn False
conv4_block19_1_relu False
conv4_block19_2_conv False
conv4_block19_concat False
conv4_block20_0_bn False
conv4_block20_0_relu False
conv4_block20_1_conv False
conv4_block20_1_bn False
conv4_block20_1_relu False
conv4_block20_2_conv False
conv4_block20_concat False
conv4_block21_0_bn False
conv4_block21_0_relu False
conv4_block21_1_conv False
conv4_block21_1_bn False
conv4_block21_1_relu False
conv4_block21_2_conv False
conv4_block21_concat False
conv4_block22_0_bn False
conv4_block22_0_relu False
conv4_block22_1_conv False
conv4_block22_1_bn False
conv4_block22_1_relu False
conv4_block22_2_conv False
conv4_block22_concat False
conv4_block23_0_bn False
conv4_block23_0_relu False
conv4_block23_1_conv False
conv4_block23_1_bn False
conv4_block23_1_relu False
conv4_block23_2_conv False
conv4_block23_concat False
conv4_block24_0_bn False
conv4_block24_0_relu False
conv4_block24_1_conv False
conv4_block24_1_bn False
conv4_block24_1_relu False
conv4_block24_2_conv False
conv4_block24_concat False
pool4_bn False
pool4_relu False
pool4_conv False
pool4_pool False
conv5_block1_0_bn False
conv5_block1_0_relu False
conv5_block1_1_conv False
conv5_block1_1_bn False
conv5_block1_1_relu False
conv5_block1_2_conv False
conv5_block1_concat False
conv5_block2_0_bn False
conv5_block2_0_relu False
conv5_block2_1_conv False
conv5_block2_1_bn False
conv5_block2_1_relu False
conv5_block2_2_conv False
conv5_block2_concat False
conv5_block3_0_bn False
conv5_block3_0_relu False
conv5_block3_1_conv False
conv5_block3_1_bn False
conv5_block3_1_relu False
conv5_block3_2_conv False
conv5_block3_concat False
conv5_block4_0_bn False
conv5_block4_0_relu False
conv5_block4_1_conv False
conv5_block4_1_bn False
conv5_block4_1_relu False
conv5_block4_2_conv False
conv5_block4_concat False
conv5_block5_0_bn False
conv5_block5_0_relu False
conv5_block5_1_conv False
conv5_block5_1_bn False
conv5_block5_1_relu False
conv5_block5_2_conv False
conv5_block5_concat False
conv5_block6_0_bn False
conv5_block6_0_relu False
conv5_block6_1_conv False
conv5_block6_1_bn False
conv5_block6_1_relu False
conv5_block6_2_conv False
conv5_block6_concat False
conv5_block7_0_bn False
conv5_block7_0_relu False
conv5_block7_1_conv False
conv5_block7_1_bn False
conv5_block7_1_relu False
conv5_block7_2_conv False
conv5_block7_concat False
conv5_block8_0_bn False
conv5_block8_0_relu False
conv5_block8_1_conv False
conv5_block8_1_bn False
conv5_block8_1_relu False
conv5_block8_2_conv False
conv5_block8_concat False
conv5_block9_0_bn False
conv5_block9_0_relu False
conv5_block9_1_conv False
conv5_block9_1_bn False
conv5_block9_1_relu False
conv5_block9_2_conv False
conv5_block9_concat False
conv5_block10_0_bn False
conv5_block10_0_relu False
conv5_block10_1_conv False
conv5_block10_1_bn False
conv5_block10_1_relu False
conv5_block10_2_conv False
conv5_block10_concat False
conv5_block11_0_bn False
conv5_block11_0_relu False
conv5_block11_1_conv False
conv5_block11_1_bn False
conv5_block11_1_relu False
conv5_block11_2_conv False
conv5_block11_concat False
conv5_block12_0_bn False
conv5_block12_0_relu False
conv5_block12_1_conv False
conv5_block12_1_bn False
conv5_block12_1_relu False
conv5_block12_2_conv False
conv5_block12_concat False
conv5_block13_0_bn False
conv5_block13_0_relu False
conv5_block13_1_conv False
conv5_block13_1_bn False
conv5_block13_1_relu False
conv5_block13_2_conv False
conv5_block13_concat False
conv5_block14_0_bn False
conv5_block14_0_relu False
conv5_block14_1_conv False
conv5_block14_1_bn False
conv5_block14_1_relu False
conv5_block14_2_conv False
conv5_block14_concat False
conv5_block15_0_bn False
conv5_block15_0_relu False
conv5_block15_1_conv False
conv5_block15_1_bn False
conv5_block15_1_relu False
conv5_block15_2_conv False
conv5_block15_concat False
conv5_block16_0_bn False
conv5_block16_0_relu False
conv5_block16_1_conv True
conv5_block16_1_bn True
conv5_block16_1_relu True
conv5_block16_2_conv True
conv5_block16_concat True
bn True
relu True
avg_pool True
fc1000 True
In [11]:
save_model(model_1, 'model_1')
Saved model architecture under model_1.json
In [22]:
model_1_hist = train_model(model_1, 'model_1', train_gen, valid_gen, epochs=100)
Epoch 1/100
34/34 [==============================] - 61s 2s/step - loss: 0.6867 - binary_accuracy: 0.6112 - val_loss: 0.6805 - val_binary_accuracy: 0.5172

Epoch 00001: val_loss improved from inf to 0.68054, saving model to model_1.best.hdf5
Epoch 2/100
34/34 [==============================] - 59s 2s/step - loss: 0.6733 - binary_accuracy: 0.6243 - val_loss: 0.6664 - val_binary_accuracy: 0.5575

Epoch 00002: val_loss improved from 0.68054 to 0.66641, saving model to model_1.best.hdf5
Epoch 3/100
34/34 [==============================] - 60s 2s/step - loss: 0.6518 - binary_accuracy: 0.6463 - val_loss: 0.6448 - val_binary_accuracy: 0.6494

Epoch 00003: val_loss improved from 0.66641 to 0.64481, saving model to model_1.best.hdf5
Epoch 4/100
34/34 [==============================] - 63s 2s/step - loss: 0.6349 - binary_accuracy: 0.6584 - val_loss: 0.6287 - val_binary_accuracy: 0.6322

Epoch 00004: val_loss improved from 0.64481 to 0.62870, saving model to model_1.best.hdf5
Epoch 5/100
34/34 [==============================] - 63s 2s/step - loss: 0.6307 - binary_accuracy: 0.6664 - val_loss: 0.6204 - val_binary_accuracy: 0.6379

Epoch 00005: val_loss improved from 0.62870 to 0.62037, saving model to model_1.best.hdf5
Epoch 6/100
34/34 [==============================] - 61s 2s/step - loss: 0.6208 - binary_accuracy: 0.6766 - val_loss: 0.6144 - val_binary_accuracy: 0.6207

Epoch 00006: val_loss improved from 0.62037 to 0.61436, saving model to model_1.best.hdf5
Epoch 7/100
34/34 [==============================] - 61s 2s/step - loss: 0.6185 - binary_accuracy: 0.6710 - val_loss: 0.6119 - val_binary_accuracy: 0.6494

Epoch 00007: val_loss improved from 0.61436 to 0.61188, saving model to model_1.best.hdf5
Epoch 8/100
34/34 [==============================] - 62s 2s/step - loss: 0.6130 - binary_accuracy: 0.6776 - val_loss: 0.6074 - val_binary_accuracy: 0.6322

Epoch 00008: val_loss improved from 0.61188 to 0.60738, saving model to model_1.best.hdf5
Epoch 9/100
34/34 [==============================] - 63s 2s/step - loss: 0.6007 - binary_accuracy: 0.7005 - val_loss: 0.6075 - val_binary_accuracy: 0.6379

Epoch 00009: val_loss did not improve from 0.60738
Epoch 10/100
34/34 [==============================] - 63s 2s/step - loss: 0.5811 - binary_accuracy: 0.7196 - val_loss: 0.6235 - val_binary_accuracy: 0.6207

Epoch 00010: val_loss did not improve from 0.60738
Epoch 11/100
34/34 [==============================] - 61s 2s/step - loss: 0.5879 - binary_accuracy: 0.6939 - val_loss: 0.6450 - val_binary_accuracy: 0.6034

Epoch 00011: val_loss did not improve from 0.60738
Epoch 12/100
34/34 [==============================] - 68s 2s/step - loss: 0.5787 - binary_accuracy: 0.7056 - val_loss: 0.6144 - val_binary_accuracy: 0.6207

Epoch 00012: val_loss did not improve from 0.60738
Epoch 13/100
34/34 [==============================] - 74s 2s/step - loss: 0.5617 - binary_accuracy: 0.7257 - val_loss: 0.6463 - val_binary_accuracy: 0.6149

Epoch 00013: val_loss did not improve from 0.60738
Epoch 14/100
34/34 [==============================] - 56s 2s/step - loss: 0.5518 - binary_accuracy: 0.7383 - val_loss: 0.6344 - val_binary_accuracy: 0.5977

Epoch 00014: val_loss did not improve from 0.60738
Epoch 15/100
34/34 [==============================] - 55s 2s/step - loss: 0.5378 - binary_accuracy: 0.7500 - val_loss: 0.6576 - val_binary_accuracy: 0.6149

Epoch 00015: val_loss did not improve from 0.60738
Epoch 16/100
34/34 [==============================] - 55s 2s/step - loss: 0.5347 - binary_accuracy: 0.7509 - val_loss: 0.6326 - val_binary_accuracy: 0.6207

Epoch 00016: val_loss did not improve from 0.60738
Epoch 17/100
34/34 [==============================] - 54s 2s/step - loss: 0.5131 - binary_accuracy: 0.7607 - val_loss: 0.7521 - val_binary_accuracy: 0.5977

Epoch 00017: val_loss did not improve from 0.60738
Epoch 18/100
34/34 [==============================] - 54s 2s/step - loss: 0.5129 - binary_accuracy: 0.7673 - val_loss: 0.6924 - val_binary_accuracy: 0.6322

Epoch 00018: val_loss did not improve from 0.60738
Epoch 19/100
34/34 [==============================] - 54s 2s/step - loss: 0.5047 - binary_accuracy: 0.7692 - val_loss: 0.7069 - val_binary_accuracy: 0.6322

Epoch 00019: val_loss did not improve from 0.60738
Epoch 20/100
34/34 [==============================] - 54s 2s/step - loss: 0.4905 - binary_accuracy: 0.7776 - val_loss: 0.7634 - val_binary_accuracy: 0.5920

Epoch 00020: val_loss did not improve from 0.60738
Epoch 21/100
34/34 [==============================] - 54s 2s/step - loss: 0.4840 - binary_accuracy: 0.7813 - val_loss: 0.6591 - val_binary_accuracy: 0.6207

Epoch 00021: val_loss did not improve from 0.60738
Epoch 22/100
34/34 [==============================] - 53s 2s/step - loss: 0.4720 - binary_accuracy: 0.7897 - val_loss: 0.6511 - val_binary_accuracy: 0.6264

Epoch 00022: val_loss did not improve from 0.60738
Epoch 23/100
34/34 [==============================] - 54s 2s/step - loss: 0.4478 - binary_accuracy: 0.8093 - val_loss: 0.6817 - val_binary_accuracy: 0.6207

Epoch 00023: val_loss did not improve from 0.60738
In [24]:
pred_Y_1, ground_truth_1, evaluation_dic_1 = predict_and_evaluate_model(model_1, "model_1", test_gen, steps=len(test_df)/64)
22/21 [==============================] - 22s 1s/step
Model prediction min: 0.0785
Model prediction max: 0.926
  • After 8th epoch the validation loss started to increase, while the training loss continued to increase suggesting overfitting.
  • The validation accuracy started to decrease after 3rd epoch confirming overfitting (best model weights were saved after 5th epoch).
  • The prediction distibution is wide and exhibits two peaks, each for one class. The peak near 1.0, i.e. pneumonia positive class, has more counts, then the peak near 0.0, i.e pneumonia negative class. In contrast the test set comprises of 20% of positive pneumonia cases.
  • The AUC amounts to 0.61.

In the next model the dropout will be increased to 0.5 to avoid overfitting.

Model 2

In [25]:
# The same as model 1 with increased dropout 

# Model 2: 
## transfer layer: conv5_block16_1_conv (idx 420)
## learning rate E-4
## dropout 0.5

model_2_pretrained = load_pretrained_model(pretrained_model=DenseNet121)
model_2 = build_model(model_2_pretrained, dropout=0.5)
save_model(model_2, 'model_2')
Pre-trained model layers and their trainability
input_2 False
zero_padding2d_3 False
conv1/conv False
conv1/bn False
conv1/relu False
zero_padding2d_4 False
pool1 False
conv2_block1_0_bn False
conv2_block1_0_relu False
conv2_block1_1_conv False
conv2_block1_1_bn False
conv2_block1_1_relu False
conv2_block1_2_conv False
conv2_block1_concat False
conv2_block2_0_bn False
conv2_block2_0_relu False
conv2_block2_1_conv False
conv2_block2_1_bn False
conv2_block2_1_relu False
conv2_block2_2_conv False
conv2_block2_concat False
conv2_block3_0_bn False
conv2_block3_0_relu False
conv2_block3_1_conv False
conv2_block3_1_bn False
conv2_block3_1_relu False
conv2_block3_2_conv False
conv2_block3_concat False
conv2_block4_0_bn False
conv2_block4_0_relu False
conv2_block4_1_conv False
conv2_block4_1_bn False
conv2_block4_1_relu False
conv2_block4_2_conv False
conv2_block4_concat False
conv2_block5_0_bn False
conv2_block5_0_relu False
conv2_block5_1_conv False
conv2_block5_1_bn False
conv2_block5_1_relu False
conv2_block5_2_conv False
conv2_block5_concat False
conv2_block6_0_bn False
conv2_block6_0_relu False
conv2_block6_1_conv False
conv2_block6_1_bn False
conv2_block6_1_relu False
conv2_block6_2_conv False
conv2_block6_concat False
pool2_bn False
pool2_relu False
pool2_conv False
pool2_pool False
conv3_block1_0_bn False
conv3_block1_0_relu False
conv3_block1_1_conv False
conv3_block1_1_bn False
conv3_block1_1_relu False
conv3_block1_2_conv False
conv3_block1_concat False
conv3_block2_0_bn False
conv3_block2_0_relu False
conv3_block2_1_conv False
conv3_block2_1_bn False
conv3_block2_1_relu False
conv3_block2_2_conv False
conv3_block2_concat False
conv3_block3_0_bn False
conv3_block3_0_relu False
conv3_block3_1_conv False
conv3_block3_1_bn False
conv3_block3_1_relu False
conv3_block3_2_conv False
conv3_block3_concat False
conv3_block4_0_bn False
conv3_block4_0_relu False
conv3_block4_1_conv False
conv3_block4_1_bn False
conv3_block4_1_relu False
conv3_block4_2_conv False
conv3_block4_concat False
conv3_block5_0_bn False
conv3_block5_0_relu False
conv3_block5_1_conv False
conv3_block5_1_bn False
conv3_block5_1_relu False
conv3_block5_2_conv False
conv3_block5_concat False
conv3_block6_0_bn False
conv3_block6_0_relu False
conv3_block6_1_conv False
conv3_block6_1_bn False
conv3_block6_1_relu False
conv3_block6_2_conv False
conv3_block6_concat False
conv3_block7_0_bn False
conv3_block7_0_relu False
conv3_block7_1_conv False
conv3_block7_1_bn False
conv3_block7_1_relu False
conv3_block7_2_conv False
conv3_block7_concat False
conv3_block8_0_bn False
conv3_block8_0_relu False
conv3_block8_1_conv False
conv3_block8_1_bn False
conv3_block8_1_relu False
conv3_block8_2_conv False
conv3_block8_concat False
conv3_block9_0_bn False
conv3_block9_0_relu False
conv3_block9_1_conv False
conv3_block9_1_bn False
conv3_block9_1_relu False
conv3_block9_2_conv False
conv3_block9_concat False
conv3_block10_0_bn False
conv3_block10_0_relu False
conv3_block10_1_conv False
conv3_block10_1_bn False
conv3_block10_1_relu False
conv3_block10_2_conv False
conv3_block10_concat False
conv3_block11_0_bn False
conv3_block11_0_relu False
conv3_block11_1_conv False
conv3_block11_1_bn False
conv3_block11_1_relu False
conv3_block11_2_conv False
conv3_block11_concat False
conv3_block12_0_bn False
conv3_block12_0_relu False
conv3_block12_1_conv False
conv3_block12_1_bn False
conv3_block12_1_relu False
conv3_block12_2_conv False
conv3_block12_concat False
pool3_bn False
pool3_relu False
pool3_conv False
pool3_pool False
conv4_block1_0_bn False
conv4_block1_0_relu False
conv4_block1_1_conv False
conv4_block1_1_bn False
conv4_block1_1_relu False
conv4_block1_2_conv False
conv4_block1_concat False
conv4_block2_0_bn False
conv4_block2_0_relu False
conv4_block2_1_conv False
conv4_block2_1_bn False
conv4_block2_1_relu False
conv4_block2_2_conv False
conv4_block2_concat False
conv4_block3_0_bn False
conv4_block3_0_relu False
conv4_block3_1_conv False
conv4_block3_1_bn False
conv4_block3_1_relu False
conv4_block3_2_conv False
conv4_block3_concat False
conv4_block4_0_bn False
conv4_block4_0_relu False
conv4_block4_1_conv False
conv4_block4_1_bn False
conv4_block4_1_relu False
conv4_block4_2_conv False
conv4_block4_concat False
conv4_block5_0_bn False
conv4_block5_0_relu False
conv4_block5_1_conv False
conv4_block5_1_bn False
conv4_block5_1_relu False
conv4_block5_2_conv False
conv4_block5_concat False
conv4_block6_0_bn False
conv4_block6_0_relu False
conv4_block6_1_conv False
conv4_block6_1_bn False
conv4_block6_1_relu False
conv4_block6_2_conv False
conv4_block6_concat False
conv4_block7_0_bn False
conv4_block7_0_relu False
conv4_block7_1_conv False
conv4_block7_1_bn False
conv4_block7_1_relu False
conv4_block7_2_conv False
conv4_block7_concat False
conv4_block8_0_bn False
conv4_block8_0_relu False
conv4_block8_1_conv False
conv4_block8_1_bn False
conv4_block8_1_relu False
conv4_block8_2_conv False
conv4_block8_concat False
conv4_block9_0_bn False
conv4_block9_0_relu False
conv4_block9_1_conv False
conv4_block9_1_bn False
conv4_block9_1_relu False
conv4_block9_2_conv False
conv4_block9_concat False
conv4_block10_0_bn False
conv4_block10_0_relu False
conv4_block10_1_conv False
conv4_block10_1_bn False
conv4_block10_1_relu False
conv4_block10_2_conv False
conv4_block10_concat False
conv4_block11_0_bn False
conv4_block11_0_relu False
conv4_block11_1_conv False
conv4_block11_1_bn False
conv4_block11_1_relu False
conv4_block11_2_conv False
conv4_block11_concat False
conv4_block12_0_bn False
conv4_block12_0_relu False
conv4_block12_1_conv False
conv4_block12_1_bn False
conv4_block12_1_relu False
conv4_block12_2_conv False
conv4_block12_concat False
conv4_block13_0_bn False
conv4_block13_0_relu False
conv4_block13_1_conv False
conv4_block13_1_bn False
conv4_block13_1_relu False
conv4_block13_2_conv False
conv4_block13_concat False
conv4_block14_0_bn False
conv4_block14_0_relu False
conv4_block14_1_conv False
conv4_block14_1_bn False
conv4_block14_1_relu False
conv4_block14_2_conv False
conv4_block14_concat False
conv4_block15_0_bn False
conv4_block15_0_relu False
conv4_block15_1_conv False
conv4_block15_1_bn False
conv4_block15_1_relu False
conv4_block15_2_conv False
conv4_block15_concat False
conv4_block16_0_bn False
conv4_block16_0_relu False
conv4_block16_1_conv False
conv4_block16_1_bn False
conv4_block16_1_relu False
conv4_block16_2_conv False
conv4_block16_concat False
conv4_block17_0_bn False
conv4_block17_0_relu False
conv4_block17_1_conv False
conv4_block17_1_bn False
conv4_block17_1_relu False
conv4_block17_2_conv False
conv4_block17_concat False
conv4_block18_0_bn False
conv4_block18_0_relu False
conv4_block18_1_conv False
conv4_block18_1_bn False
conv4_block18_1_relu False
conv4_block18_2_conv False
conv4_block18_concat False
conv4_block19_0_bn False
conv4_block19_0_relu False
conv4_block19_1_conv False
conv4_block19_1_bn False
conv4_block19_1_relu False
conv4_block19_2_conv False
conv4_block19_concat False
conv4_block20_0_bn False
conv4_block20_0_relu False
conv4_block20_1_conv False
conv4_block20_1_bn False
conv4_block20_1_relu False
conv4_block20_2_conv False
conv4_block20_concat False
conv4_block21_0_bn False
conv4_block21_0_relu False
conv4_block21_1_conv False
conv4_block21_1_bn False
conv4_block21_1_relu False
conv4_block21_2_conv False
conv4_block21_concat False
conv4_block22_0_bn False
conv4_block22_0_relu False
conv4_block22_1_conv False
conv4_block22_1_bn False
conv4_block22_1_relu False
conv4_block22_2_conv False
conv4_block22_concat False
conv4_block23_0_bn False
conv4_block23_0_relu False
conv4_block23_1_conv False
conv4_block23_1_bn False
conv4_block23_1_relu False
conv4_block23_2_conv False
conv4_block23_concat False
conv4_block24_0_bn False
conv4_block24_0_relu False
conv4_block24_1_conv False
conv4_block24_1_bn False
conv4_block24_1_relu False
conv4_block24_2_conv False
conv4_block24_concat False
pool4_bn False
pool4_relu False
pool4_conv False
pool4_pool False
conv5_block1_0_bn False
conv5_block1_0_relu False
conv5_block1_1_conv False
conv5_block1_1_bn False
conv5_block1_1_relu False
conv5_block1_2_conv False
conv5_block1_concat False
conv5_block2_0_bn False
conv5_block2_0_relu False
conv5_block2_1_conv False
conv5_block2_1_bn False
conv5_block2_1_relu False
conv5_block2_2_conv False
conv5_block2_concat False
conv5_block3_0_bn False
conv5_block3_0_relu False
conv5_block3_1_conv False
conv5_block3_1_bn False
conv5_block3_1_relu False
conv5_block3_2_conv False
conv5_block3_concat False
conv5_block4_0_bn False
conv5_block4_0_relu False
conv5_block4_1_conv False
conv5_block4_1_bn False
conv5_block4_1_relu False
conv5_block4_2_conv False
conv5_block4_concat False
conv5_block5_0_bn False
conv5_block5_0_relu False
conv5_block5_1_conv False
conv5_block5_1_bn False
conv5_block5_1_relu False
conv5_block5_2_conv False
conv5_block5_concat False
conv5_block6_0_bn False
conv5_block6_0_relu False
conv5_block6_1_conv False
conv5_block6_1_bn False
conv5_block6_1_relu False
conv5_block6_2_conv False
conv5_block6_concat False
conv5_block7_0_bn False
conv5_block7_0_relu False
conv5_block7_1_conv False
conv5_block7_1_bn False
conv5_block7_1_relu False
conv5_block7_2_conv False
conv5_block7_concat False
conv5_block8_0_bn False
conv5_block8_0_relu False
conv5_block8_1_conv False
conv5_block8_1_bn False
conv5_block8_1_relu False
conv5_block8_2_conv False
conv5_block8_concat False
conv5_block9_0_bn False
conv5_block9_0_relu False
conv5_block9_1_conv False
conv5_block9_1_bn False
conv5_block9_1_relu False
conv5_block9_2_conv False
conv5_block9_concat False
conv5_block10_0_bn False
conv5_block10_0_relu False
conv5_block10_1_conv False
conv5_block10_1_bn False
conv5_block10_1_relu False
conv5_block10_2_conv False
conv5_block10_concat False
conv5_block11_0_bn False
conv5_block11_0_relu False
conv5_block11_1_conv False
conv5_block11_1_bn False
conv5_block11_1_relu False
conv5_block11_2_conv False
conv5_block11_concat False
conv5_block12_0_bn False
conv5_block12_0_relu False
conv5_block12_1_conv False
conv5_block12_1_bn False
conv5_block12_1_relu False
conv5_block12_2_conv False
conv5_block12_concat False
conv5_block13_0_bn False
conv5_block13_0_relu False
conv5_block13_1_conv False
conv5_block13_1_bn False
conv5_block13_1_relu False
conv5_block13_2_conv False
conv5_block13_concat False
conv5_block14_0_bn False
conv5_block14_0_relu False
conv5_block14_1_conv False
conv5_block14_1_bn False
conv5_block14_1_relu False
conv5_block14_2_conv False
conv5_block14_concat False
conv5_block15_0_bn False
conv5_block15_0_relu False
conv5_block15_1_conv False
conv5_block15_1_bn False
conv5_block15_1_relu False
conv5_block15_2_conv False
conv5_block15_concat False
conv5_block16_0_bn False
conv5_block16_0_relu False
conv5_block16_1_conv True
conv5_block16_1_bn True
conv5_block16_1_relu True
conv5_block16_2_conv True
conv5_block16_concat True
bn True
relu True
avg_pool True
fc1000 True
Saved model architecture under model_2.json
In [26]:
model_2_hist = train_model(model_2, 'model_2', train_gen, valid_gen, epochs=100)
Epoch 1/100
34/34 [==============================] - 62s 2s/step - loss: 0.6931 - binary_accuracy: 0.4995 - val_loss: 0.6930 - val_binary_accuracy: 0.5172

Epoch 00001: val_loss improved from inf to 0.69304, saving model to model_2.best.hdf5
Epoch 2/100
34/34 [==============================] - 51s 1s/step - loss: 0.6930 - binary_accuracy: 0.5308 - val_loss: 0.6927 - val_binary_accuracy: 0.5287

Epoch 00002: val_loss improved from 0.69304 to 0.69271, saving model to model_2.best.hdf5
Epoch 3/100
34/34 [==============================] - 54s 2s/step - loss: 0.6926 - binary_accuracy: 0.5491 - val_loss: 0.6921 - val_binary_accuracy: 0.5460

Epoch 00003: val_loss improved from 0.69271 to 0.69207, saving model to model_2.best.hdf5
Epoch 4/100
34/34 [==============================] - 54s 2s/step - loss: 0.6916 - binary_accuracy: 0.5720 - val_loss: 0.6928 - val_binary_accuracy: 0.5345

Epoch 00004: val_loss did not improve from 0.69207
Epoch 5/100
34/34 [==============================] - 56s 2s/step - loss: 0.6874 - binary_accuracy: 0.6037 - val_loss: 0.6912 - val_binary_accuracy: 0.5460

Epoch 00005: val_loss improved from 0.69207 to 0.69122, saving model to model_2.best.hdf5
Epoch 6/100
34/34 [==============================] - 56s 2s/step - loss: 0.6767 - binary_accuracy: 0.6346 - val_loss: 0.6929 - val_binary_accuracy: 0.5287

Epoch 00006: val_loss did not improve from 0.69122
Epoch 7/100
34/34 [==============================] - 55s 2s/step - loss: 0.6585 - binary_accuracy: 0.6523 - val_loss: 0.7042 - val_binary_accuracy: 0.5230

Epoch 00007: val_loss did not improve from 0.69122
Epoch 8/100
34/34 [==============================] - 56s 2s/step - loss: 0.6414 - binary_accuracy: 0.6556 - val_loss: 0.7325 - val_binary_accuracy: 0.5460

Epoch 00008: val_loss did not improve from 0.69122
Epoch 9/100
34/34 [==============================] - 56s 2s/step - loss: 0.6315 - binary_accuracy: 0.6575 - val_loss: 0.7474 - val_binary_accuracy: 0.5402

Epoch 00009: val_loss did not improve from 0.69122
Epoch 10/100
34/34 [==============================] - 56s 2s/step - loss: 0.6253 - binary_accuracy: 0.6780 - val_loss: 0.7430 - val_binary_accuracy: 0.5287

Epoch 00010: val_loss did not improve from 0.69122
Epoch 11/100
34/34 [==============================] - 55s 2s/step - loss: 0.6158 - binary_accuracy: 0.6855 - val_loss: 0.7733 - val_binary_accuracy: 0.5575

Epoch 00011: val_loss did not improve from 0.69122
Epoch 12/100
34/34 [==============================] - 55s 2s/step - loss: 0.6066 - binary_accuracy: 0.6921 - val_loss: 0.8112 - val_binary_accuracy: 0.5230

Epoch 00012: val_loss did not improve from 0.69122
Epoch 13/100
34/34 [==============================] - 55s 2s/step - loss: 0.6037 - binary_accuracy: 0.6935 - val_loss: 0.8259 - val_binary_accuracy: 0.5172

Epoch 00013: val_loss did not improve from 0.69122
Epoch 14/100
34/34 [==============================] - 54s 2s/step - loss: 0.5822 - binary_accuracy: 0.7131 - val_loss: 0.8361 - val_binary_accuracy: 0.5115

Epoch 00014: val_loss did not improve from 0.69122
Epoch 15/100
34/34 [==============================] - 53s 2s/step - loss: 0.5807 - binary_accuracy: 0.7145 - val_loss: 0.8937 - val_binary_accuracy: 0.5230

Epoch 00015: val_loss did not improve from 0.69122
Epoch 16/100
34/34 [==============================] - 53s 2s/step - loss: 0.5788 - binary_accuracy: 0.7131 - val_loss: 0.8859 - val_binary_accuracy: 0.5115

Epoch 00016: val_loss did not improve from 0.69122
Epoch 17/100
34/34 [==============================] - 54s 2s/step - loss: 0.5684 - binary_accuracy: 0.7234 - val_loss: 0.9342 - val_binary_accuracy: 0.5172

Epoch 00017: val_loss did not improve from 0.69122
Epoch 18/100
34/34 [==============================] - 53s 2s/step - loss: 0.5497 - binary_accuracy: 0.7374 - val_loss: 0.9459 - val_binary_accuracy: 0.5230

Epoch 00018: val_loss did not improve from 0.69122
Epoch 19/100
34/34 [==============================] - 54s 2s/step - loss: 0.5470 - binary_accuracy: 0.7444 - val_loss: 0.9569 - val_binary_accuracy: 0.5402

Epoch 00019: val_loss did not improve from 0.69122
Epoch 20/100
34/34 [==============================] - 53s 2s/step - loss: 0.5357 - binary_accuracy: 0.7467 - val_loss: 0.9951 - val_binary_accuracy: 0.5230

Epoch 00020: val_loss did not improve from 0.69122
In [27]:
pred_Y_2, ground_truth_2, evaluation_dic_2 = predict_and_evaluate_model(model_2, "model_2", test_gen, steps=len(test_df)/64)
22/21 [==============================] - 22s 1s/step
Model prediction min: 0.155
Model prediction max: 0.868
  • After 5th epoch the validation loss started to increase, while the training loss continued to increase suggesting overfitting.
  • The validation accuracy started to decrease after 3rd epoch confirming overfitting (best model weights were saved after 5th epoch).
  • The prediction distibution exhibits single peak near 1.0, i.e. pneumonia positive class, meaning that most of the scans are predicted as pneumonia positve.

The next model will be based on model 1 with increased starting learnig rate, as smaller learning rate ranges increase the risk of stucking in local minimum resulting in lack of improvement in validation accuracy.

Model 3

In [11]:
# The same as model 1 with increased learning rate 

# Model 3: 
## transfer layer: conv5_block16_1_conv (idx 420)
## learning rate E-3
## dropout 0.2

model_3_pretrained = load_pretrained_model(pretrained_model=DenseNet121)
model_3 = build_model(model_3_pretrained, lr=0.01, dropout=0.2)
save_model(model_3, 'model_3')
Pre-trained model layers and their trainability
input_2 False
zero_padding2d_3 False
conv1/conv False
conv1/bn False
conv1/relu False
zero_padding2d_4 False
pool1 False
conv2_block1_0_bn False
conv2_block1_0_relu False
conv2_block1_1_conv False
conv2_block1_1_bn False
conv2_block1_1_relu False
conv2_block1_2_conv False
conv2_block1_concat False
conv2_block2_0_bn False
conv2_block2_0_relu False
conv2_block2_1_conv False
conv2_block2_1_bn False
conv2_block2_1_relu False
conv2_block2_2_conv False
conv2_block2_concat False
conv2_block3_0_bn False
conv2_block3_0_relu False
conv2_block3_1_conv False
conv2_block3_1_bn False
conv2_block3_1_relu False
conv2_block3_2_conv False
conv2_block3_concat False
conv2_block4_0_bn False
conv2_block4_0_relu False
conv2_block4_1_conv False
conv2_block4_1_bn False
conv2_block4_1_relu False
conv2_block4_2_conv False
conv2_block4_concat False
conv2_block5_0_bn False
conv2_block5_0_relu False
conv2_block5_1_conv False
conv2_block5_1_bn False
conv2_block5_1_relu False
conv2_block5_2_conv False
conv2_block5_concat False
conv2_block6_0_bn False
conv2_block6_0_relu False
conv2_block6_1_conv False
conv2_block6_1_bn False
conv2_block6_1_relu False
conv2_block6_2_conv False
conv2_block6_concat False
pool2_bn False
pool2_relu False
pool2_conv False
pool2_pool False
conv3_block1_0_bn False
conv3_block1_0_relu False
conv3_block1_1_conv False
conv3_block1_1_bn False
conv3_block1_1_relu False
conv3_block1_2_conv False
conv3_block1_concat False
conv3_block2_0_bn False
conv3_block2_0_relu False
conv3_block2_1_conv False
conv3_block2_1_bn False
conv3_block2_1_relu False
conv3_block2_2_conv False
conv3_block2_concat False
conv3_block3_0_bn False
conv3_block3_0_relu False
conv3_block3_1_conv False
conv3_block3_1_bn False
conv3_block3_1_relu False
conv3_block3_2_conv False
conv3_block3_concat False
conv3_block4_0_bn False
conv3_block4_0_relu False
conv3_block4_1_conv False
conv3_block4_1_bn False
conv3_block4_1_relu False
conv3_block4_2_conv False
conv3_block4_concat False
conv3_block5_0_bn False
conv3_block5_0_relu False
conv3_block5_1_conv False
conv3_block5_1_bn False
conv3_block5_1_relu False
conv3_block5_2_conv False
conv3_block5_concat False
conv3_block6_0_bn False
conv3_block6_0_relu False
conv3_block6_1_conv False
conv3_block6_1_bn False
conv3_block6_1_relu False
conv3_block6_2_conv False
conv3_block6_concat False
conv3_block7_0_bn False
conv3_block7_0_relu False
conv3_block7_1_conv False
conv3_block7_1_bn False
conv3_block7_1_relu False
conv3_block7_2_conv False
conv3_block7_concat False
conv3_block8_0_bn False
conv3_block8_0_relu False
conv3_block8_1_conv False
conv3_block8_1_bn False
conv3_block8_1_relu False
conv3_block8_2_conv False
conv3_block8_concat False
conv3_block9_0_bn False
conv3_block9_0_relu False
conv3_block9_1_conv False
conv3_block9_1_bn False
conv3_block9_1_relu False
conv3_block9_2_conv False
conv3_block9_concat False
conv3_block10_0_bn False
conv3_block10_0_relu False
conv3_block10_1_conv False
conv3_block10_1_bn False
conv3_block10_1_relu False
conv3_block10_2_conv False
conv3_block10_concat False
conv3_block11_0_bn False
conv3_block11_0_relu False
conv3_block11_1_conv False
conv3_block11_1_bn False
conv3_block11_1_relu False
conv3_block11_2_conv False
conv3_block11_concat False
conv3_block12_0_bn False
conv3_block12_0_relu False
conv3_block12_1_conv False
conv3_block12_1_bn False
conv3_block12_1_relu False
conv3_block12_2_conv False
conv3_block12_concat False
pool3_bn False
pool3_relu False
pool3_conv False
pool3_pool False
conv4_block1_0_bn False
conv4_block1_0_relu False
conv4_block1_1_conv False
conv4_block1_1_bn False
conv4_block1_1_relu False
conv4_block1_2_conv False
conv4_block1_concat False
conv4_block2_0_bn False
conv4_block2_0_relu False
conv4_block2_1_conv False
conv4_block2_1_bn False
conv4_block2_1_relu False
conv4_block2_2_conv False
conv4_block2_concat False
conv4_block3_0_bn False
conv4_block3_0_relu False
conv4_block3_1_conv False
conv4_block3_1_bn False
conv4_block3_1_relu False
conv4_block3_2_conv False
conv4_block3_concat False
conv4_block4_0_bn False
conv4_block4_0_relu False
conv4_block4_1_conv False
conv4_block4_1_bn False
conv4_block4_1_relu False
conv4_block4_2_conv False
conv4_block4_concat False
conv4_block5_0_bn False
conv4_block5_0_relu False
conv4_block5_1_conv False
conv4_block5_1_bn False
conv4_block5_1_relu False
conv4_block5_2_conv False
conv4_block5_concat False
conv4_block6_0_bn False
conv4_block6_0_relu False
conv4_block6_1_conv False
conv4_block6_1_bn False
conv4_block6_1_relu False
conv4_block6_2_conv False
conv4_block6_concat False
conv4_block7_0_bn False
conv4_block7_0_relu False
conv4_block7_1_conv False
conv4_block7_1_bn False
conv4_block7_1_relu False
conv4_block7_2_conv False
conv4_block7_concat False
conv4_block8_0_bn False
conv4_block8_0_relu False
conv4_block8_1_conv False
conv4_block8_1_bn False
conv4_block8_1_relu False
conv4_block8_2_conv False
conv4_block8_concat False
conv4_block9_0_bn False
conv4_block9_0_relu False
conv4_block9_1_conv False
conv4_block9_1_bn False
conv4_block9_1_relu False
conv4_block9_2_conv False
conv4_block9_concat False
conv4_block10_0_bn False
conv4_block10_0_relu False
conv4_block10_1_conv False
conv4_block10_1_bn False
conv4_block10_1_relu False
conv4_block10_2_conv False
conv4_block10_concat False
conv4_block11_0_bn False
conv4_block11_0_relu False
conv4_block11_1_conv False
conv4_block11_1_bn False
conv4_block11_1_relu False
conv4_block11_2_conv False
conv4_block11_concat False
conv4_block12_0_bn False
conv4_block12_0_relu False
conv4_block12_1_conv False
conv4_block12_1_bn False
conv4_block12_1_relu False
conv4_block12_2_conv False
conv4_block12_concat False
conv4_block13_0_bn False
conv4_block13_0_relu False
conv4_block13_1_conv False
conv4_block13_1_bn False
conv4_block13_1_relu False
conv4_block13_2_conv False
conv4_block13_concat False
conv4_block14_0_bn False
conv4_block14_0_relu False
conv4_block14_1_conv False
conv4_block14_1_bn False
conv4_block14_1_relu False
conv4_block14_2_conv False
conv4_block14_concat False
conv4_block15_0_bn False
conv4_block15_0_relu False
conv4_block15_1_conv False
conv4_block15_1_bn False
conv4_block15_1_relu False
conv4_block15_2_conv False
conv4_block15_concat False
conv4_block16_0_bn False
conv4_block16_0_relu False
conv4_block16_1_conv False
conv4_block16_1_bn False
conv4_block16_1_relu False
conv4_block16_2_conv False
conv4_block16_concat False
conv4_block17_0_bn False
conv4_block17_0_relu False
conv4_block17_1_conv False
conv4_block17_1_bn False
conv4_block17_1_relu False
conv4_block17_2_conv False
conv4_block17_concat False
conv4_block18_0_bn False
conv4_block18_0_relu False
conv4_block18_1_conv False
conv4_block18_1_bn False
conv4_block18_1_relu False
conv4_block18_2_conv False
conv4_block18_concat False
conv4_block19_0_bn False
conv4_block19_0_relu False
conv4_block19_1_conv False
conv4_block19_1_bn False
conv4_block19_1_relu False
conv4_block19_2_conv False
conv4_block19_concat False
conv4_block20_0_bn False
conv4_block20_0_relu False
conv4_block20_1_conv False
conv4_block20_1_bn False
conv4_block20_1_relu False
conv4_block20_2_conv False
conv4_block20_concat False
conv4_block21_0_bn False
conv4_block21_0_relu False
conv4_block21_1_conv False
conv4_block21_1_bn False
conv4_block21_1_relu False
conv4_block21_2_conv False
conv4_block21_concat False
conv4_block22_0_bn False
conv4_block22_0_relu False
conv4_block22_1_conv False
conv4_block22_1_bn False
conv4_block22_1_relu False
conv4_block22_2_conv False
conv4_block22_concat False
conv4_block23_0_bn False
conv4_block23_0_relu False
conv4_block23_1_conv False
conv4_block23_1_bn False
conv4_block23_1_relu False
conv4_block23_2_conv False
conv4_block23_concat False
conv4_block24_0_bn False
conv4_block24_0_relu False
conv4_block24_1_conv False
conv4_block24_1_bn False
conv4_block24_1_relu False
conv4_block24_2_conv False
conv4_block24_concat False
pool4_bn False
pool4_relu False
pool4_conv False
pool4_pool False
conv5_block1_0_bn False
conv5_block1_0_relu False
conv5_block1_1_conv False
conv5_block1_1_bn False
conv5_block1_1_relu False
conv5_block1_2_conv False
conv5_block1_concat False
conv5_block2_0_bn False
conv5_block2_0_relu False
conv5_block2_1_conv False
conv5_block2_1_bn False
conv5_block2_1_relu False
conv5_block2_2_conv False
conv5_block2_concat False
conv5_block3_0_bn False
conv5_block3_0_relu False
conv5_block3_1_conv False
conv5_block3_1_bn False
conv5_block3_1_relu False
conv5_block3_2_conv False
conv5_block3_concat False
conv5_block4_0_bn False
conv5_block4_0_relu False
conv5_block4_1_conv False
conv5_block4_1_bn False
conv5_block4_1_relu False
conv5_block4_2_conv False
conv5_block4_concat False
conv5_block5_0_bn False
conv5_block5_0_relu False
conv5_block5_1_conv False
conv5_block5_1_bn False
conv5_block5_1_relu False
conv5_block5_2_conv False
conv5_block5_concat False
conv5_block6_0_bn False
conv5_block6_0_relu False
conv5_block6_1_conv False
conv5_block6_1_bn False
conv5_block6_1_relu False
conv5_block6_2_conv False
conv5_block6_concat False
conv5_block7_0_bn False
conv5_block7_0_relu False
conv5_block7_1_conv False
conv5_block7_1_bn False
conv5_block7_1_relu False
conv5_block7_2_conv False
conv5_block7_concat False
conv5_block8_0_bn False
conv5_block8_0_relu False
conv5_block8_1_conv False
conv5_block8_1_bn False
conv5_block8_1_relu False
conv5_block8_2_conv False
conv5_block8_concat False
conv5_block9_0_bn False
conv5_block9_0_relu False
conv5_block9_1_conv False
conv5_block9_1_bn False
conv5_block9_1_relu False
conv5_block9_2_conv False
conv5_block9_concat False
conv5_block10_0_bn False
conv5_block10_0_relu False
conv5_block10_1_conv False
conv5_block10_1_bn False
conv5_block10_1_relu False
conv5_block10_2_conv False
conv5_block10_concat False
conv5_block11_0_bn False
conv5_block11_0_relu False
conv5_block11_1_conv False
conv5_block11_1_bn False
conv5_block11_1_relu False
conv5_block11_2_conv False
conv5_block11_concat False
conv5_block12_0_bn False
conv5_block12_0_relu False
conv5_block12_1_conv False
conv5_block12_1_bn False
conv5_block12_1_relu False
conv5_block12_2_conv False
conv5_block12_concat False
conv5_block13_0_bn False
conv5_block13_0_relu False
conv5_block13_1_conv False
conv5_block13_1_bn False
conv5_block13_1_relu False
conv5_block13_2_conv False
conv5_block13_concat False
conv5_block14_0_bn False
conv5_block14_0_relu False
conv5_block14_1_conv False
conv5_block14_1_bn False
conv5_block14_1_relu False
conv5_block14_2_conv False
conv5_block14_concat False
conv5_block15_0_bn False
conv5_block15_0_relu False
conv5_block15_1_conv False
conv5_block15_1_bn False
conv5_block15_1_relu False
conv5_block15_2_conv False
conv5_block15_concat False
conv5_block16_0_bn False
conv5_block16_0_relu False
conv5_block16_1_conv True
conv5_block16_1_bn True
conv5_block16_1_relu True
conv5_block16_2_conv True
conv5_block16_concat True
bn True
relu True
avg_pool True
fc1000 True
Saved model architecture under model_3.json
In [29]:
model_3_hist = train_model(model_3, 'model_3', train_gen, valid_gen, epochs=100)
Epoch 1/100
34/34 [==============================] - 60s 2s/step - loss: 0.6840 - binary_accuracy: 0.5710 - val_loss: 0.6795 - val_binary_accuracy: 0.5172

Epoch 00001: val_loss improved from inf to 0.67955, saving model to model_3.best.hdf5
Epoch 2/100
34/34 [==============================] - 51s 1s/step - loss: 0.6876 - binary_accuracy: 0.5963 - val_loss: 0.6842 - val_binary_accuracy: 0.6092

Epoch 00002: val_loss did not improve from 0.67955
Epoch 3/100
34/34 [==============================] - 54s 2s/step - loss: 0.6706 - binary_accuracy: 0.6196 - val_loss: 0.6546 - val_binary_accuracy: 0.6322

Epoch 00003: val_loss improved from 0.67955 to 0.65459, saving model to model_3.best.hdf5
Epoch 4/100
34/34 [==============================] - 53s 2s/step - loss: 0.6547 - binary_accuracy: 0.6336 - val_loss: 0.6985 - val_binary_accuracy: 0.4943

Epoch 00004: val_loss did not improve from 0.65459
Epoch 5/100
34/34 [==============================] - 53s 2s/step - loss: 0.6588 - binary_accuracy: 0.6014 - val_loss: 0.6940 - val_binary_accuracy: 0.5000

Epoch 00005: val_loss did not improve from 0.65459
Epoch 6/100
34/34 [==============================] - 53s 2s/step - loss: 0.6896 - binary_accuracy: 0.5061 - val_loss: 0.6931 - val_binary_accuracy: 0.5000

Epoch 00006: val_loss did not improve from 0.65459
Epoch 7/100
34/34 [==============================] - 53s 2s/step - loss: 0.6614 - binary_accuracy: 0.6117 - val_loss: 0.6820 - val_binary_accuracy: 0.5345

Epoch 00007: val_loss did not improve from 0.65459
Epoch 8/100
34/34 [==============================] - 53s 2s/step - loss: 0.6744 - binary_accuracy: 0.5710 - val_loss: 0.6835 - val_binary_accuracy: 0.5287

Epoch 00008: val_loss did not improve from 0.65459
Epoch 9/100
34/34 [==============================] - 54s 2s/step - loss: 0.6626 - binary_accuracy: 0.6290 - val_loss: 0.6802 - val_binary_accuracy: 0.5172

Epoch 00009: val_loss did not improve from 0.65459
Epoch 10/100
34/34 [==============================] - 52s 2s/step - loss: 0.6576 - binary_accuracy: 0.6290 - val_loss: 0.6991 - val_binary_accuracy: 0.5000

Epoch 00010: val_loss did not improve from 0.65459
Epoch 11/100
34/34 [==============================] - 53s 2s/step - loss: 0.6868 - binary_accuracy: 0.5210 - val_loss: 0.6923 - val_binary_accuracy: 0.5000

Epoch 00011: val_loss did not improve from 0.65459
Epoch 12/100
34/34 [==============================] - 53s 2s/step - loss: 0.6907 - binary_accuracy: 0.4944 - val_loss: 0.6931 - val_binary_accuracy: 0.5000

Epoch 00012: val_loss did not improve from 0.65459
Epoch 13/100
34/34 [==============================] - 53s 2s/step - loss: 0.6823 - binary_accuracy: 0.5206 - val_loss: 0.6821 - val_binary_accuracy: 0.5057

Epoch 00013: val_loss did not improve from 0.65459
Epoch 14/100
34/34 [==============================] - 53s 2s/step - loss: 0.6661 - binary_accuracy: 0.6014 - val_loss: 0.6142 - val_binary_accuracy: 0.5805

Epoch 00014: val_loss improved from 0.65459 to 0.61420, saving model to model_3.best.hdf5
Epoch 15/100
34/34 [==============================] - 53s 2s/step - loss: 0.6512 - binary_accuracy: 0.6350 - val_loss: 0.6351 - val_binary_accuracy: 0.5345

Epoch 00015: val_loss did not improve from 0.61420
Epoch 16/100
34/34 [==============================] - 54s 2s/step - loss: 0.6612 - binary_accuracy: 0.6341 - val_loss: 0.7119 - val_binary_accuracy: 0.5000

Epoch 00016: val_loss did not improve from 0.61420
Epoch 17/100
34/34 [==============================] - 53s 2s/step - loss: 0.6839 - binary_accuracy: 0.5481 - val_loss: 0.6925 - val_binary_accuracy: 0.5000

Epoch 00017: val_loss did not improve from 0.61420
Epoch 18/100
34/34 [==============================] - 53s 2s/step - loss: 0.6797 - binary_accuracy: 0.5266 - val_loss: 0.6925 - val_binary_accuracy: 0.5000

Epoch 00018: val_loss did not improve from 0.61420
Epoch 19/100
34/34 [==============================] - 52s 2s/step - loss: 0.6516 - binary_accuracy: 0.6458 - val_loss: 0.6522 - val_binary_accuracy: 0.5402

Epoch 00019: val_loss did not improve from 0.61420
Epoch 20/100
34/34 [==============================] - 53s 2s/step - loss: 0.6471 - binary_accuracy: 0.6458 - val_loss: 0.6526 - val_binary_accuracy: 0.6034

Epoch 00020: val_loss did not improve from 0.61420
Epoch 21/100
34/34 [==============================] - 53s 2s/step - loss: 0.6444 - binary_accuracy: 0.6593 - val_loss: 0.6666 - val_binary_accuracy: 0.5460

Epoch 00021: val_loss did not improve from 0.61420
Epoch 22/100
34/34 [==============================] - 53s 2s/step - loss: 0.6401 - binary_accuracy: 0.6472 - val_loss: 0.6378 - val_binary_accuracy: 0.5862

Epoch 00022: val_loss did not improve from 0.61420
Epoch 23/100
34/34 [==============================] - 53s 2s/step - loss: 0.6588 - binary_accuracy: 0.6435 - val_loss: 0.7132 - val_binary_accuracy: 0.5805

Epoch 00023: val_loss did not improve from 0.61420
Epoch 24/100
34/34 [==============================] - 53s 2s/step - loss: 0.6473 - binary_accuracy: 0.6551 - val_loss: 0.7347 - val_binary_accuracy: 0.5115

Epoch 00024: val_loss did not improve from 0.61420
Epoch 25/100
34/34 [==============================] - 53s 2s/step - loss: 0.6615 - binary_accuracy: 0.6308 - val_loss: 0.6923 - val_binary_accuracy: 0.5460

Epoch 00025: val_loss did not improve from 0.61420
Epoch 26/100
34/34 [==============================] - 52s 2s/step - loss: 0.6415 - binary_accuracy: 0.6621 - val_loss: 0.7175 - val_binary_accuracy: 0.5000

Epoch 00026: val_loss did not improve from 0.61420
Epoch 27/100
34/34 [==============================] - 52s 2s/step - loss: 0.6476 - binary_accuracy: 0.6444 - val_loss: 0.7078 - val_binary_accuracy: 0.5000

Epoch 00027: val_loss did not improve from 0.61420
Epoch 28/100
34/34 [==============================] - 52s 2s/step - loss: 0.6494 - binary_accuracy: 0.6472 - val_loss: 0.7088 - val_binary_accuracy: 0.5000

Epoch 00028: val_loss did not improve from 0.61420
Epoch 29/100
34/34 [==============================] - 53s 2s/step - loss: 0.6481 - binary_accuracy: 0.6537 - val_loss: 0.7161 - val_binary_accuracy: 0.5000

Epoch 00029: val_loss did not improve from 0.61420
In [30]:
pred_Y_3, ground_truth_3, evaluation_dic_3 = predict_and_evaluate_model(model_3, "model_3", test_gen, steps=len(test_df)/64)
22/21 [==============================] - 22s 1s/step
Model prediction min: 0.371
Model prediction max: 0.628
  • The training and validation loss did not decrease, while the accuracies did not increase during training, meaning that the model did not learn. Those signals are very noisy, suggesting that the learning rate ranges are too high and thus overshooting minimum/local minima.

  • The AUC amounts to 0.5, a value equal to random choice.

The next model will be based on model 1 with more layers frozen to avoid overfitting.

Model 4

In [22]:
# The same as model 1 with more layers frozen 

# Model 4: 
## transfer layer: avg_pool (idx 427)
## learning rate E-4
## dropout 0.2
model_4_pretrained = load_pretrained_model(pretrained_model=DenseNet121, transfer_layer='avg_pool', 
                          transfer_layer_idx=427)
model_4 = build_model(model_4_pretrained, dropout=0.2)
save_model(model_4, 'model_4')
Pre-trained model layers and their trainability
input_4 False
zero_padding2d_7 False
conv1/conv False
conv1/bn False
conv1/relu False
zero_padding2d_8 False
pool1 False
conv2_block1_0_bn False
conv2_block1_0_relu False
conv2_block1_1_conv False
conv2_block1_1_bn False
conv2_block1_1_relu False
conv2_block1_2_conv False
conv2_block1_concat False
conv2_block2_0_bn False
conv2_block2_0_relu False
conv2_block2_1_conv False
conv2_block2_1_bn False
conv2_block2_1_relu False
conv2_block2_2_conv False
conv2_block2_concat False
conv2_block3_0_bn False
conv2_block3_0_relu False
conv2_block3_1_conv False
conv2_block3_1_bn False
conv2_block3_1_relu False
conv2_block3_2_conv False
conv2_block3_concat False
conv2_block4_0_bn False
conv2_block4_0_relu False
conv2_block4_1_conv False
conv2_block4_1_bn False
conv2_block4_1_relu False
conv2_block4_2_conv False
conv2_block4_concat False
conv2_block5_0_bn False
conv2_block5_0_relu False
conv2_block5_1_conv False
conv2_block5_1_bn False
conv2_block5_1_relu False
conv2_block5_2_conv False
conv2_block5_concat False
conv2_block6_0_bn False
conv2_block6_0_relu False
conv2_block6_1_conv False
conv2_block6_1_bn False
conv2_block6_1_relu False
conv2_block6_2_conv False
conv2_block6_concat False
pool2_bn False
pool2_relu False
pool2_conv False
pool2_pool False
conv3_block1_0_bn False
conv3_block1_0_relu False
conv3_block1_1_conv False
conv3_block1_1_bn False
conv3_block1_1_relu False
conv3_block1_2_conv False
conv3_block1_concat False
conv3_block2_0_bn False
conv3_block2_0_relu False
conv3_block2_1_conv False
conv3_block2_1_bn False
conv3_block2_1_relu False
conv3_block2_2_conv False
conv3_block2_concat False
conv3_block3_0_bn False
conv3_block3_0_relu False
conv3_block3_1_conv False
conv3_block3_1_bn False
conv3_block3_1_relu False
conv3_block3_2_conv False
conv3_block3_concat False
conv3_block4_0_bn False
conv3_block4_0_relu False
conv3_block4_1_conv False
conv3_block4_1_bn False
conv3_block4_1_relu False
conv3_block4_2_conv False
conv3_block4_concat False
conv3_block5_0_bn False
conv3_block5_0_relu False
conv3_block5_1_conv False
conv3_block5_1_bn False
conv3_block5_1_relu False
conv3_block5_2_conv False
conv3_block5_concat False
conv3_block6_0_bn False
conv3_block6_0_relu False
conv3_block6_1_conv False
conv3_block6_1_bn False
conv3_block6_1_relu False
conv3_block6_2_conv False
conv3_block6_concat False
conv3_block7_0_bn False
conv3_block7_0_relu False
conv3_block7_1_conv False
conv3_block7_1_bn False
conv3_block7_1_relu False
conv3_block7_2_conv False
conv3_block7_concat False
conv3_block8_0_bn False
conv3_block8_0_relu False
conv3_block8_1_conv False
conv3_block8_1_bn False
conv3_block8_1_relu False
conv3_block8_2_conv False
conv3_block8_concat False
conv3_block9_0_bn False
conv3_block9_0_relu False
conv3_block9_1_conv False
conv3_block9_1_bn False
conv3_block9_1_relu False
conv3_block9_2_conv False
conv3_block9_concat False
conv3_block10_0_bn False
conv3_block10_0_relu False
conv3_block10_1_conv False
conv3_block10_1_bn False
conv3_block10_1_relu False
conv3_block10_2_conv False
conv3_block10_concat False
conv3_block11_0_bn False
conv3_block11_0_relu False
conv3_block11_1_conv False
conv3_block11_1_bn False
conv3_block11_1_relu False
conv3_block11_2_conv False
conv3_block11_concat False
conv3_block12_0_bn False
conv3_block12_0_relu False
conv3_block12_1_conv False
conv3_block12_1_bn False
conv3_block12_1_relu False
conv3_block12_2_conv False
conv3_block12_concat False
pool3_bn False
pool3_relu False
pool3_conv False
pool3_pool False
conv4_block1_0_bn False
conv4_block1_0_relu False
conv4_block1_1_conv False
conv4_block1_1_bn False
conv4_block1_1_relu False
conv4_block1_2_conv False
conv4_block1_concat False
conv4_block2_0_bn False
conv4_block2_0_relu False
conv4_block2_1_conv False
conv4_block2_1_bn False
conv4_block2_1_relu False
conv4_block2_2_conv False
conv4_block2_concat False
conv4_block3_0_bn False
conv4_block3_0_relu False
conv4_block3_1_conv False
conv4_block3_1_bn False
conv4_block3_1_relu False
conv4_block3_2_conv False
conv4_block3_concat False
conv4_block4_0_bn False
conv4_block4_0_relu False
conv4_block4_1_conv False
conv4_block4_1_bn False
conv4_block4_1_relu False
conv4_block4_2_conv False
conv4_block4_concat False
conv4_block5_0_bn False
conv4_block5_0_relu False
conv4_block5_1_conv False
conv4_block5_1_bn False
conv4_block5_1_relu False
conv4_block5_2_conv False
conv4_block5_concat False
conv4_block6_0_bn False
conv4_block6_0_relu False
conv4_block6_1_conv False
conv4_block6_1_bn False
conv4_block6_1_relu False
conv4_block6_2_conv False
conv4_block6_concat False
conv4_block7_0_bn False
conv4_block7_0_relu False
conv4_block7_1_conv False
conv4_block7_1_bn False
conv4_block7_1_relu False
conv4_block7_2_conv False
conv4_block7_concat False
conv4_block8_0_bn False
conv4_block8_0_relu False
conv4_block8_1_conv False
conv4_block8_1_bn False
conv4_block8_1_relu False
conv4_block8_2_conv False
conv4_block8_concat False
conv4_block9_0_bn False
conv4_block9_0_relu False
conv4_block9_1_conv False
conv4_block9_1_bn False
conv4_block9_1_relu False
conv4_block9_2_conv False
conv4_block9_concat False
conv4_block10_0_bn False
conv4_block10_0_relu False
conv4_block10_1_conv False
conv4_block10_1_bn False
conv4_block10_1_relu False
conv4_block10_2_conv False
conv4_block10_concat False
conv4_block11_0_bn False
conv4_block11_0_relu False
conv4_block11_1_conv False
conv4_block11_1_bn False
conv4_block11_1_relu False
conv4_block11_2_conv False
conv4_block11_concat False
conv4_block12_0_bn False
conv4_block12_0_relu False
conv4_block12_1_conv False
conv4_block12_1_bn False
conv4_block12_1_relu False
conv4_block12_2_conv False
conv4_block12_concat False
conv4_block13_0_bn False
conv4_block13_0_relu False
conv4_block13_1_conv False
conv4_block13_1_bn False
conv4_block13_1_relu False
conv4_block13_2_conv False
conv4_block13_concat False
conv4_block14_0_bn False
conv4_block14_0_relu False
conv4_block14_1_conv False
conv4_block14_1_bn False
conv4_block14_1_relu False
conv4_block14_2_conv False
conv4_block14_concat False
conv4_block15_0_bn False
conv4_block15_0_relu False
conv4_block15_1_conv False
conv4_block15_1_bn False
conv4_block15_1_relu False
conv4_block15_2_conv False
conv4_block15_concat False
conv4_block16_0_bn False
conv4_block16_0_relu False
conv4_block16_1_conv False
conv4_block16_1_bn False
conv4_block16_1_relu False
conv4_block16_2_conv False
conv4_block16_concat False
conv4_block17_0_bn False
conv4_block17_0_relu False
conv4_block17_1_conv False
conv4_block17_1_bn False
conv4_block17_1_relu False
conv4_block17_2_conv False
conv4_block17_concat False
conv4_block18_0_bn False
conv4_block18_0_relu False
conv4_block18_1_conv False
conv4_block18_1_bn False
conv4_block18_1_relu False
conv4_block18_2_conv False
conv4_block18_concat False
conv4_block19_0_bn False
conv4_block19_0_relu False
conv4_block19_1_conv False
conv4_block19_1_bn False
conv4_block19_1_relu False
conv4_block19_2_conv False
conv4_block19_concat False
conv4_block20_0_bn False
conv4_block20_0_relu False
conv4_block20_1_conv False
conv4_block20_1_bn False
conv4_block20_1_relu False
conv4_block20_2_conv False
conv4_block20_concat False
conv4_block21_0_bn False
conv4_block21_0_relu False
conv4_block21_1_conv False
conv4_block21_1_bn False
conv4_block21_1_relu False
conv4_block21_2_conv False
conv4_block21_concat False
conv4_block22_0_bn False
conv4_block22_0_relu False
conv4_block22_1_conv False
conv4_block22_1_bn False
conv4_block22_1_relu False
conv4_block22_2_conv False
conv4_block22_concat False
conv4_block23_0_bn False
conv4_block23_0_relu False
conv4_block23_1_conv False
conv4_block23_1_bn False
conv4_block23_1_relu False
conv4_block23_2_conv False
conv4_block23_concat False
conv4_block24_0_bn False
conv4_block24_0_relu False
conv4_block24_1_conv False
conv4_block24_1_bn False
conv4_block24_1_relu False
conv4_block24_2_conv False
conv4_block24_concat False
pool4_bn False
pool4_relu False
pool4_conv False
pool4_pool False
conv5_block1_0_bn False
conv5_block1_0_relu False
conv5_block1_1_conv False
conv5_block1_1_bn False
conv5_block1_1_relu False
conv5_block1_2_conv False
conv5_block1_concat False
conv5_block2_0_bn False
conv5_block2_0_relu False
conv5_block2_1_conv False
conv5_block2_1_bn False
conv5_block2_1_relu False
conv5_block2_2_conv False
conv5_block2_concat False
conv5_block3_0_bn False
conv5_block3_0_relu False
conv5_block3_1_conv False
conv5_block3_1_bn False
conv5_block3_1_relu False
conv5_block3_2_conv False
conv5_block3_concat False
conv5_block4_0_bn False
conv5_block4_0_relu False
conv5_block4_1_conv False
conv5_block4_1_bn False
conv5_block4_1_relu False
conv5_block4_2_conv False
conv5_block4_concat False
conv5_block5_0_bn False
conv5_block5_0_relu False
conv5_block5_1_conv False
conv5_block5_1_bn False
conv5_block5_1_relu False
conv5_block5_2_conv False
conv5_block5_concat False
conv5_block6_0_bn False
conv5_block6_0_relu False
conv5_block6_1_conv False
conv5_block6_1_bn False
conv5_block6_1_relu False
conv5_block6_2_conv False
conv5_block6_concat False
conv5_block7_0_bn False
conv5_block7_0_relu False
conv5_block7_1_conv False
conv5_block7_1_bn False
conv5_block7_1_relu False
conv5_block7_2_conv False
conv5_block7_concat False
conv5_block8_0_bn False
conv5_block8_0_relu False
conv5_block8_1_conv False
conv5_block8_1_bn False
conv5_block8_1_relu False
conv5_block8_2_conv False
conv5_block8_concat False
conv5_block9_0_bn False
conv5_block9_0_relu False
conv5_block9_1_conv False
conv5_block9_1_bn False
conv5_block9_1_relu False
conv5_block9_2_conv False
conv5_block9_concat False
conv5_block10_0_bn False
conv5_block10_0_relu False
conv5_block10_1_conv False
conv5_block10_1_bn False
conv5_block10_1_relu False
conv5_block10_2_conv False
conv5_block10_concat False
conv5_block11_0_bn False
conv5_block11_0_relu False
conv5_block11_1_conv False
conv5_block11_1_bn False
conv5_block11_1_relu False
conv5_block11_2_conv False
conv5_block11_concat False
conv5_block12_0_bn False
conv5_block12_0_relu False
conv5_block12_1_conv False
conv5_block12_1_bn False
conv5_block12_1_relu False
conv5_block12_2_conv False
conv5_block12_concat False
conv5_block13_0_bn False
conv5_block13_0_relu False
conv5_block13_1_conv False
conv5_block13_1_bn False
conv5_block13_1_relu False
conv5_block13_2_conv False
conv5_block13_concat False
conv5_block14_0_bn False
conv5_block14_0_relu False
conv5_block14_1_conv False
conv5_block14_1_bn False
conv5_block14_1_relu False
conv5_block14_2_conv False
conv5_block14_concat False
conv5_block15_0_bn False
conv5_block15_0_relu False
conv5_block15_1_conv False
conv5_block15_1_bn False
conv5_block15_1_relu False
conv5_block15_2_conv False
conv5_block15_concat False
conv5_block16_0_bn False
conv5_block16_0_relu False
conv5_block16_1_conv False
conv5_block16_1_bn False
conv5_block16_1_relu False
conv5_block16_2_conv False
conv5_block16_concat False
bn False
relu False
avg_pool True
fc1000 True
Saved model architecture under model_4.json
In [37]:
model_4_hist = train_model(model_4, 'model_4', train_gen, valid_gen, epochs=50)
Epoch 1/50
34/34 [==============================] - 60s 2s/step - loss: 0.6931 - binary_accuracy: 0.5164 - val_loss: 0.6933 - val_binary_accuracy: 0.5517

Epoch 00001: val_loss improved from inf to 0.69332, saving model to model_4.best.hdf5
Epoch 2/50
34/34 [==============================] - 49s 1s/step - loss: 0.6926 - binary_accuracy: 0.5645 - val_loss: 0.6926 - val_binary_accuracy: 0.5862

Epoch 00002: val_loss improved from 0.69332 to 0.69262, saving model to model_4.best.hdf5
Epoch 3/50
34/34 [==============================] - 52s 2s/step - loss: 0.6912 - binary_accuracy: 0.5935 - val_loss: 0.6894 - val_binary_accuracy: 0.5747

Epoch 00003: val_loss improved from 0.69262 to 0.68940, saving model to model_4.best.hdf5
Epoch 4/50
34/34 [==============================] - 53s 2s/step - loss: 0.6883 - binary_accuracy: 0.6126 - val_loss: 0.6838 - val_binary_accuracy: 0.6149

Epoch 00004: val_loss improved from 0.68940 to 0.68385, saving model to model_4.best.hdf5
Epoch 5/50
34/34 [==============================] - 53s 2s/step - loss: 0.6831 - binary_accuracy: 0.6084 - val_loss: 0.6723 - val_binary_accuracy: 0.5920

Epoch 00005: val_loss improved from 0.68385 to 0.67232, saving model to model_4.best.hdf5
Epoch 6/50
34/34 [==============================] - 53s 2s/step - loss: 0.6761 - binary_accuracy: 0.6178 - val_loss: 0.6669 - val_binary_accuracy: 0.5460

Epoch 00006: val_loss improved from 0.67232 to 0.66688, saving model to model_4.best.hdf5
Epoch 7/50
34/34 [==============================] - 52s 2s/step - loss: 0.6674 - binary_accuracy: 0.6210 - val_loss: 0.6583 - val_binary_accuracy: 0.5632

Epoch 00007: val_loss improved from 0.66688 to 0.65832, saving model to model_4.best.hdf5
Epoch 8/50
34/34 [==============================] - 53s 2s/step - loss: 0.6593 - binary_accuracy: 0.6313 - val_loss: 0.6593 - val_binary_accuracy: 0.5517

Epoch 00008: val_loss did not improve from 0.65832
Epoch 9/50
34/34 [==============================] - 53s 2s/step - loss: 0.6553 - binary_accuracy: 0.6294 - val_loss: 0.6418 - val_binary_accuracy: 0.5690

Epoch 00009: val_loss improved from 0.65832 to 0.64181, saving model to model_4.best.hdf5
Epoch 10/50
34/34 [==============================] - 53s 2s/step - loss: 0.6502 - binary_accuracy: 0.6397 - val_loss: 0.6600 - val_binary_accuracy: 0.5805

Epoch 00010: val_loss did not improve from 0.64181
Epoch 11/50
34/34 [==============================] - 53s 2s/step - loss: 0.6493 - binary_accuracy: 0.6252 - val_loss: 0.6532 - val_binary_accuracy: 0.5690

Epoch 00011: val_loss did not improve from 0.64181
Epoch 12/50
34/34 [==============================] - 53s 2s/step - loss: 0.6457 - binary_accuracy: 0.6308 - val_loss: 0.6367 - val_binary_accuracy: 0.5862

Epoch 00012: val_loss improved from 0.64181 to 0.63669, saving model to model_4.best.hdf5
Epoch 13/50
34/34 [==============================] - 56s 2s/step - loss: 0.6399 - binary_accuracy: 0.6444 - val_loss: 0.6397 - val_binary_accuracy: 0.5862

Epoch 00013: val_loss did not improve from 0.63669
Epoch 14/50
34/34 [==============================] - 56s 2s/step - loss: 0.6346 - binary_accuracy: 0.6467 - val_loss: 0.6394 - val_binary_accuracy: 0.5920

Epoch 00014: val_loss did not improve from 0.63669
Epoch 15/50
34/34 [==============================] - 56s 2s/step - loss: 0.6357 - binary_accuracy: 0.6453 - val_loss: 0.6395 - val_binary_accuracy: 0.5977

Epoch 00015: val_loss did not improve from 0.63669
Epoch 16/50
34/34 [==============================] - 56s 2s/step - loss: 0.6417 - binary_accuracy: 0.6411 - val_loss: 0.6326 - val_binary_accuracy: 0.6034

Epoch 00016: val_loss improved from 0.63669 to 0.63263, saving model to model_4.best.hdf5
Epoch 17/50
34/34 [==============================] - 56s 2s/step - loss: 0.6352 - binary_accuracy: 0.6514 - val_loss: 0.6344 - val_binary_accuracy: 0.5977

Epoch 00017: val_loss did not improve from 0.63263
Epoch 18/50
34/34 [==============================] - 56s 2s/step - loss: 0.6261 - binary_accuracy: 0.6547 - val_loss: 0.6382 - val_binary_accuracy: 0.5920

Epoch 00018: val_loss did not improve from 0.63263
Epoch 19/50
34/34 [==============================] - 55s 2s/step - loss: 0.6278 - binary_accuracy: 0.6565 - val_loss: 0.6430 - val_binary_accuracy: 0.5977

Epoch 00019: val_loss did not improve from 0.63263
Epoch 20/50
34/34 [==============================] - 56s 2s/step - loss: 0.6290 - binary_accuracy: 0.6570 - val_loss: 0.6447 - val_binary_accuracy: 0.6034

Epoch 00020: val_loss did not improve from 0.63263
Epoch 21/50
34/34 [==============================] - 55s 2s/step - loss: 0.6208 - binary_accuracy: 0.6561 - val_loss: 0.6413 - val_binary_accuracy: 0.6092

Epoch 00021: val_loss did not improve from 0.63263
Epoch 22/50
34/34 [==============================] - 55s 2s/step - loss: 0.6220 - binary_accuracy: 0.6579 - val_loss: 0.6450 - val_binary_accuracy: 0.6034

Epoch 00022: val_loss did not improve from 0.63263
Epoch 23/50
34/34 [==============================] - 55s 2s/step - loss: 0.6207 - binary_accuracy: 0.6626 - val_loss: 0.6449 - val_binary_accuracy: 0.5690

Epoch 00023: val_loss did not improve from 0.63263
Epoch 24/50
34/34 [==============================] - 56s 2s/step - loss: 0.6247 - binary_accuracy: 0.6514 - val_loss: 0.6403 - val_binary_accuracy: 0.5747

Epoch 00024: val_loss did not improve from 0.63263
Epoch 25/50
34/34 [==============================] - 55s 2s/step - loss: 0.6221 - binary_accuracy: 0.6706 - val_loss: 0.6385 - val_binary_accuracy: 0.5862

Epoch 00025: val_loss did not improve from 0.63263
Epoch 26/50
34/34 [==============================] - 56s 2s/step - loss: 0.6157 - binary_accuracy: 0.6668 - val_loss: 0.6383 - val_binary_accuracy: 0.5690

Epoch 00026: val_loss did not improve from 0.63263
Epoch 27/50
34/34 [==============================] - 56s 2s/step - loss: 0.6103 - binary_accuracy: 0.6790 - val_loss: 0.6021 - val_binary_accuracy: 0.5920

Epoch 00027: val_loss improved from 0.63263 to 0.60208, saving model to model_4.best.hdf5
Epoch 28/50
34/34 [==============================] - 56s 2s/step - loss: 0.6155 - binary_accuracy: 0.6645 - val_loss: 0.6117 - val_binary_accuracy: 0.5747

Epoch 00028: val_loss did not improve from 0.60208
Epoch 29/50
34/34 [==============================] - 56s 2s/step - loss: 0.6101 - binary_accuracy: 0.6706 - val_loss: 0.6063 - val_binary_accuracy: 0.5747

Epoch 00029: val_loss did not improve from 0.60208
Epoch 30/50
34/34 [==============================] - 55s 2s/step - loss: 0.6066 - binary_accuracy: 0.6682 - val_loss: 0.6125 - val_binary_accuracy: 0.5977

Epoch 00030: val_loss did not improve from 0.60208
Epoch 31/50
34/34 [==============================] - 56s 2s/step - loss: 0.6143 - binary_accuracy: 0.6701 - val_loss: 0.6218 - val_binary_accuracy: 0.5920

Epoch 00031: val_loss did not improve from 0.60208
Epoch 32/50
34/34 [==============================] - 56s 2s/step - loss: 0.6035 - binary_accuracy: 0.6813 - val_loss: 0.6216 - val_binary_accuracy: 0.5920

Epoch 00032: val_loss did not improve from 0.60208
Epoch 33/50
34/34 [==============================] - 55s 2s/step - loss: 0.5921 - binary_accuracy: 0.6977 - val_loss: 0.6690 - val_binary_accuracy: 0.5575

Epoch 00033: val_loss did not improve from 0.60208
Epoch 34/50
34/34 [==============================] - 55s 2s/step - loss: 0.6008 - binary_accuracy: 0.6724 - val_loss: 0.6652 - val_binary_accuracy: 0.5805

Epoch 00034: val_loss did not improve from 0.60208
Epoch 35/50
34/34 [==============================] - 56s 2s/step - loss: 0.6012 - binary_accuracy: 0.6935 - val_loss: 0.6318 - val_binary_accuracy: 0.5747

Epoch 00035: val_loss did not improve from 0.60208
Epoch 36/50
34/34 [==============================] - 56s 2s/step - loss: 0.5967 - binary_accuracy: 0.6804 - val_loss: 0.6810 - val_binary_accuracy: 0.5862

Epoch 00036: val_loss did not improve from 0.60208
Epoch 37/50
34/34 [==============================] - 56s 2s/step - loss: 0.5928 - binary_accuracy: 0.6883 - val_loss: 0.6461 - val_binary_accuracy: 0.5862

Epoch 00037: val_loss did not improve from 0.60208
Epoch 38/50
34/34 [==============================] - 56s 2s/step - loss: 0.5936 - binary_accuracy: 0.6944 - val_loss: 0.6097 - val_binary_accuracy: 0.5920

Epoch 00038: val_loss did not improve from 0.60208
Epoch 39/50
34/34 [==============================] - 56s 2s/step - loss: 0.5901 - binary_accuracy: 0.6869 - val_loss: 0.6460 - val_binary_accuracy: 0.5920

Epoch 00039: val_loss did not improve from 0.60208
Epoch 40/50
34/34 [==============================] - 56s 2s/step - loss: 0.5875 - binary_accuracy: 0.6874 - val_loss: 0.6643 - val_binary_accuracy: 0.5862

Epoch 00040: val_loss did not improve from 0.60208
Epoch 41/50
34/34 [==============================] - 55s 2s/step - loss: 0.5832 - binary_accuracy: 0.6916 - val_loss: 0.6327 - val_binary_accuracy: 0.5862

Epoch 00041: val_loss did not improve from 0.60208
Epoch 42/50
34/34 [==============================] - 55s 2s/step - loss: 0.5888 - binary_accuracy: 0.6888 - val_loss: 0.6467 - val_binary_accuracy: 0.5747

Epoch 00042: val_loss did not improve from 0.60208
In [38]:
pred_Y_4, ground_truth_4, evaluation_dic_4 = predict_and_evaluate_model(model_4, "model_4", test_gen, steps=len(test_df)/64)
22/21 [==============================] - 23s 1s/step
Model prediction min: 0.0671
Model prediction max: 0.932
  • The validation loss was decreasing at till circa 15 epoch, afterwards it was very noisy. The validation accuracy was increasing until 5th epoch, afterwards it was very noisy and lower than training accuracy (best model weights were saved after 27th epoch).
  • The prediction distibution is wide and exhibits two peaks, each for one class. The peak near 1.0, i.e. pneumonia positive class, has less counts, then the peak near 0.0, i.e pneumonia negative class, in agreement with the test set comprising of 20% of positive pneumonia cases.
  • The AUC amounts to 0.57.

In the next model only prediction layer will be attached to the pretrained model, to decrease its complexity and thus avoiding overfitting.

Model 5

In [12]:
# The same as model 4 with only prediction layer attached to the pretrained model

# Model 5: 
## transfer layer: avg_pool (idx 427)
## learning rate E-4

model_5_pretrained = load_pretrained_model(pretrained_model=DenseNet121, transfer_layer='avg_pool', 
                          transfer_layer_idx=427)
model_5 = build_simpler_model(model_5_pretrained)
save_model(model_5, 'model_5')
Pre-trained model layers and their trainability
input_1 False
zero_padding2d_1 False
conv1/conv False
conv1/bn False
conv1/relu False
zero_padding2d_2 False
pool1 False
conv2_block1_0_bn False
conv2_block1_0_relu False
conv2_block1_1_conv False
conv2_block1_1_bn False
conv2_block1_1_relu False
conv2_block1_2_conv False
conv2_block1_concat False
conv2_block2_0_bn False
conv2_block2_0_relu False
conv2_block2_1_conv False
conv2_block2_1_bn False
conv2_block2_1_relu False
conv2_block2_2_conv False
conv2_block2_concat False
conv2_block3_0_bn False
conv2_block3_0_relu False
conv2_block3_1_conv False
conv2_block3_1_bn False
conv2_block3_1_relu False
conv2_block3_2_conv False
conv2_block3_concat False
conv2_block4_0_bn False
conv2_block4_0_relu False
conv2_block4_1_conv False
conv2_block4_1_bn False
conv2_block4_1_relu False
conv2_block4_2_conv False
conv2_block4_concat False
conv2_block5_0_bn False
conv2_block5_0_relu False
conv2_block5_1_conv False
conv2_block5_1_bn False
conv2_block5_1_relu False
conv2_block5_2_conv False
conv2_block5_concat False
conv2_block6_0_bn False
conv2_block6_0_relu False
conv2_block6_1_conv False
conv2_block6_1_bn False
conv2_block6_1_relu False
conv2_block6_2_conv False
conv2_block6_concat False
pool2_bn False
pool2_relu False
pool2_conv False
pool2_pool False
conv3_block1_0_bn False
conv3_block1_0_relu False
conv3_block1_1_conv False
conv3_block1_1_bn False
conv3_block1_1_relu False
conv3_block1_2_conv False
conv3_block1_concat False
conv3_block2_0_bn False
conv3_block2_0_relu False
conv3_block2_1_conv False
conv3_block2_1_bn False
conv3_block2_1_relu False
conv3_block2_2_conv False
conv3_block2_concat False
conv3_block3_0_bn False
conv3_block3_0_relu False
conv3_block3_1_conv False
conv3_block3_1_bn False
conv3_block3_1_relu False
conv3_block3_2_conv False
conv3_block3_concat False
conv3_block4_0_bn False
conv3_block4_0_relu False
conv3_block4_1_conv False
conv3_block4_1_bn False
conv3_block4_1_relu False
conv3_block4_2_conv False
conv3_block4_concat False
conv3_block5_0_bn False
conv3_block5_0_relu False
conv3_block5_1_conv False
conv3_block5_1_bn False
conv3_block5_1_relu False
conv3_block5_2_conv False
conv3_block5_concat False
conv3_block6_0_bn False
conv3_block6_0_relu False
conv3_block6_1_conv False
conv3_block6_1_bn False
conv3_block6_1_relu False
conv3_block6_2_conv False
conv3_block6_concat False
conv3_block7_0_bn False
conv3_block7_0_relu False
conv3_block7_1_conv False
conv3_block7_1_bn False
conv3_block7_1_relu False
conv3_block7_2_conv False
conv3_block7_concat False
conv3_block8_0_bn False
conv3_block8_0_relu False
conv3_block8_1_conv False
conv3_block8_1_bn False
conv3_block8_1_relu False
conv3_block8_2_conv False
conv3_block8_concat False
conv3_block9_0_bn False
conv3_block9_0_relu False
conv3_block9_1_conv False
conv3_block9_1_bn False
conv3_block9_1_relu False
conv3_block9_2_conv False
conv3_block9_concat False
conv3_block10_0_bn False
conv3_block10_0_relu False
conv3_block10_1_conv False
conv3_block10_1_bn False
conv3_block10_1_relu False
conv3_block10_2_conv False
conv3_block10_concat False
conv3_block11_0_bn False
conv3_block11_0_relu False
conv3_block11_1_conv False
conv3_block11_1_bn False
conv3_block11_1_relu False
conv3_block11_2_conv False
conv3_block11_concat False
conv3_block12_0_bn False
conv3_block12_0_relu False
conv3_block12_1_conv False
conv3_block12_1_bn False
conv3_block12_1_relu False
conv3_block12_2_conv False
conv3_block12_concat False
pool3_bn False
pool3_relu False
pool3_conv False
pool3_pool False
conv4_block1_0_bn False
conv4_block1_0_relu False
conv4_block1_1_conv False
conv4_block1_1_bn False
conv4_block1_1_relu False
conv4_block1_2_conv False
conv4_block1_concat False
conv4_block2_0_bn False
conv4_block2_0_relu False
conv4_block2_1_conv False
conv4_block2_1_bn False
conv4_block2_1_relu False
conv4_block2_2_conv False
conv4_block2_concat False
conv4_block3_0_bn False
conv4_block3_0_relu False
conv4_block3_1_conv False
conv4_block3_1_bn False
conv4_block3_1_relu False
conv4_block3_2_conv False
conv4_block3_concat False
conv4_block4_0_bn False
conv4_block4_0_relu False
conv4_block4_1_conv False
conv4_block4_1_bn False
conv4_block4_1_relu False
conv4_block4_2_conv False
conv4_block4_concat False
conv4_block5_0_bn False
conv4_block5_0_relu False
conv4_block5_1_conv False
conv4_block5_1_bn False
conv4_block5_1_relu False
conv4_block5_2_conv False
conv4_block5_concat False
conv4_block6_0_bn False
conv4_block6_0_relu False
conv4_block6_1_conv False
conv4_block6_1_bn False
conv4_block6_1_relu False
conv4_block6_2_conv False
conv4_block6_concat False
conv4_block7_0_bn False
conv4_block7_0_relu False
conv4_block7_1_conv False
conv4_block7_1_bn False
conv4_block7_1_relu False
conv4_block7_2_conv False
conv4_block7_concat False
conv4_block8_0_bn False
conv4_block8_0_relu False
conv4_block8_1_conv False
conv4_block8_1_bn False
conv4_block8_1_relu False
conv4_block8_2_conv False
conv4_block8_concat False
conv4_block9_0_bn False
conv4_block9_0_relu False
conv4_block9_1_conv False
conv4_block9_1_bn False
conv4_block9_1_relu False
conv4_block9_2_conv False
conv4_block9_concat False
conv4_block10_0_bn False
conv4_block10_0_relu False
conv4_block10_1_conv False
conv4_block10_1_bn False
conv4_block10_1_relu False
conv4_block10_2_conv False
conv4_block10_concat False
conv4_block11_0_bn False
conv4_block11_0_relu False
conv4_block11_1_conv False
conv4_block11_1_bn False
conv4_block11_1_relu False
conv4_block11_2_conv False
conv4_block11_concat False
conv4_block12_0_bn False
conv4_block12_0_relu False
conv4_block12_1_conv False
conv4_block12_1_bn False
conv4_block12_1_relu False
conv4_block12_2_conv False
conv4_block12_concat False
conv4_block13_0_bn False
conv4_block13_0_relu False
conv4_block13_1_conv False
conv4_block13_1_bn False
conv4_block13_1_relu False
conv4_block13_2_conv False
conv4_block13_concat False
conv4_block14_0_bn False
conv4_block14_0_relu False
conv4_block14_1_conv False
conv4_block14_1_bn False
conv4_block14_1_relu False
conv4_block14_2_conv False
conv4_block14_concat False
conv4_block15_0_bn False
conv4_block15_0_relu False
conv4_block15_1_conv False
conv4_block15_1_bn False
conv4_block15_1_relu False
conv4_block15_2_conv False
conv4_block15_concat False
conv4_block16_0_bn False
conv4_block16_0_relu False
conv4_block16_1_conv False
conv4_block16_1_bn False
conv4_block16_1_relu False
conv4_block16_2_conv False
conv4_block16_concat False
conv4_block17_0_bn False
conv4_block17_0_relu False
conv4_block17_1_conv False
conv4_block17_1_bn False
conv4_block17_1_relu False
conv4_block17_2_conv False
conv4_block17_concat False
conv4_block18_0_bn False
conv4_block18_0_relu False
conv4_block18_1_conv False
conv4_block18_1_bn False
conv4_block18_1_relu False
conv4_block18_2_conv False
conv4_block18_concat False
conv4_block19_0_bn False
conv4_block19_0_relu False
conv4_block19_1_conv False
conv4_block19_1_bn False
conv4_block19_1_relu False
conv4_block19_2_conv False
conv4_block19_concat False
conv4_block20_0_bn False
conv4_block20_0_relu False
conv4_block20_1_conv False
conv4_block20_1_bn False
conv4_block20_1_relu False
conv4_block20_2_conv False
conv4_block20_concat False
conv4_block21_0_bn False
conv4_block21_0_relu False
conv4_block21_1_conv False
conv4_block21_1_bn False
conv4_block21_1_relu False
conv4_block21_2_conv False
conv4_block21_concat False
conv4_block22_0_bn False
conv4_block22_0_relu False
conv4_block22_1_conv False
conv4_block22_1_bn False
conv4_block22_1_relu False
conv4_block22_2_conv False
conv4_block22_concat False
conv4_block23_0_bn False
conv4_block23_0_relu False
conv4_block23_1_conv False
conv4_block23_1_bn False
conv4_block23_1_relu False
conv4_block23_2_conv False
conv4_block23_concat False
conv4_block24_0_bn False
conv4_block24_0_relu False
conv4_block24_1_conv False
conv4_block24_1_bn False
conv4_block24_1_relu False
conv4_block24_2_conv False
conv4_block24_concat False
pool4_bn False
pool4_relu False
pool4_conv False
pool4_pool False
conv5_block1_0_bn False
conv5_block1_0_relu False
conv5_block1_1_conv False
conv5_block1_1_bn False
conv5_block1_1_relu False
conv5_block1_2_conv False
conv5_block1_concat False
conv5_block2_0_bn False
conv5_block2_0_relu False
conv5_block2_1_conv False
conv5_block2_1_bn False
conv5_block2_1_relu False
conv5_block2_2_conv False
conv5_block2_concat False
conv5_block3_0_bn False
conv5_block3_0_relu False
conv5_block3_1_conv False
conv5_block3_1_bn False
conv5_block3_1_relu False
conv5_block3_2_conv False
conv5_block3_concat False
conv5_block4_0_bn False
conv5_block4_0_relu False
conv5_block4_1_conv False
conv5_block4_1_bn False
conv5_block4_1_relu False
conv5_block4_2_conv False
conv5_block4_concat False
conv5_block5_0_bn False
conv5_block5_0_relu False
conv5_block5_1_conv False
conv5_block5_1_bn False
conv5_block5_1_relu False
conv5_block5_2_conv False
conv5_block5_concat False
conv5_block6_0_bn False
conv5_block6_0_relu False
conv5_block6_1_conv False
conv5_block6_1_bn False
conv5_block6_1_relu False
conv5_block6_2_conv False
conv5_block6_concat False
conv5_block7_0_bn False
conv5_block7_0_relu False
conv5_block7_1_conv False
conv5_block7_1_bn False
conv5_block7_1_relu False
conv5_block7_2_conv False
conv5_block7_concat False
conv5_block8_0_bn False
conv5_block8_0_relu False
conv5_block8_1_conv False
conv5_block8_1_bn False
conv5_block8_1_relu False
conv5_block8_2_conv False
conv5_block8_concat False
conv5_block9_0_bn False
conv5_block9_0_relu False
conv5_block9_1_conv False
conv5_block9_1_bn False
conv5_block9_1_relu False
conv5_block9_2_conv False
conv5_block9_concat False
conv5_block10_0_bn False
conv5_block10_0_relu False
conv5_block10_1_conv False
conv5_block10_1_bn False
conv5_block10_1_relu False
conv5_block10_2_conv False
conv5_block10_concat False
conv5_block11_0_bn False
conv5_block11_0_relu False
conv5_block11_1_conv False
conv5_block11_1_bn False
conv5_block11_1_relu False
conv5_block11_2_conv False
conv5_block11_concat False
conv5_block12_0_bn False
conv5_block12_0_relu False
conv5_block12_1_conv False
conv5_block12_1_bn False
conv5_block12_1_relu False
conv5_block12_2_conv False
conv5_block12_concat False
conv5_block13_0_bn False
conv5_block13_0_relu False
conv5_block13_1_conv False
conv5_block13_1_bn False
conv5_block13_1_relu False
conv5_block13_2_conv False
conv5_block13_concat False
conv5_block14_0_bn False
conv5_block14_0_relu False
conv5_block14_1_conv False
conv5_block14_1_bn False
conv5_block14_1_relu False
conv5_block14_2_conv False
conv5_block14_concat False
conv5_block15_0_bn False
conv5_block15_0_relu False
conv5_block15_1_conv False
conv5_block15_1_bn False
conv5_block15_1_relu False
conv5_block15_2_conv False
conv5_block15_concat False
conv5_block16_0_bn False
conv5_block16_0_relu False
conv5_block16_1_conv False
conv5_block16_1_bn False
conv5_block16_1_relu False
conv5_block16_2_conv False
conv5_block16_concat False
bn False
relu False
avg_pool True
fc1000 True
Saved model architecture under model_5.json
In [13]:
model_5_hist = train_model(model_5, 'model_5', train_gen, valid_gen, epochs=100)
Epoch 1/100
34/34 [==============================] - 65s 2s/step - loss: 0.6930 - binary_accuracy: 0.5248 - val_loss: 0.6926 - val_binary_accuracy: 0.5632

Epoch 00001: val_loss improved from inf to 0.69265, saving model to model_5.best.hdf5
Epoch 2/100
34/34 [==============================] - 49s 1s/step - loss: 0.6925 - binary_accuracy: 0.5636 - val_loss: 0.6922 - val_binary_accuracy: 0.5402

Epoch 00002: val_loss improved from 0.69265 to 0.69215, saving model to model_5.best.hdf5
Epoch 3/100
34/34 [==============================] - 54s 2s/step - loss: 0.6918 - binary_accuracy: 0.5935 - val_loss: 0.6919 - val_binary_accuracy: 0.5230

Epoch 00003: val_loss improved from 0.69215 to 0.69186, saving model to model_5.best.hdf5
Epoch 4/100
34/34 [==============================] - 50s 1s/step - loss: 0.6908 - binary_accuracy: 0.5893 - val_loss: 0.6918 - val_binary_accuracy: 0.5517

Epoch 00004: val_loss improved from 0.69186 to 0.69176, saving model to model_5.best.hdf5
Epoch 5/100
34/34 [==============================] - 53s 2s/step - loss: 0.6898 - binary_accuracy: 0.6023 - val_loss: 0.6914 - val_binary_accuracy: 0.5575

Epoch 00005: val_loss improved from 0.69176 to 0.69145, saving model to model_5.best.hdf5
Epoch 6/100
34/34 [==============================] - 50s 1s/step - loss: 0.6893 - binary_accuracy: 0.6089 - val_loss: 0.6915 - val_binary_accuracy: 0.5517

Epoch 00006: val_loss did not improve from 0.69145
Epoch 7/100
34/34 [==============================] - 52s 2s/step - loss: 0.6881 - binary_accuracy: 0.6159 - val_loss: 0.6910 - val_binary_accuracy: 0.5517

Epoch 00007: val_loss improved from 0.69145 to 0.69096, saving model to model_5.best.hdf5
Epoch 8/100
34/34 [==============================] - 52s 2s/step - loss: 0.6874 - binary_accuracy: 0.6173 - val_loss: 0.6909 - val_binary_accuracy: 0.5402

Epoch 00008: val_loss improved from 0.69096 to 0.69087, saving model to model_5.best.hdf5
Epoch 9/100
34/34 [==============================] - 52s 2s/step - loss: 0.6870 - binary_accuracy: 0.6121 - val_loss: 0.6901 - val_binary_accuracy: 0.5287

Epoch 00009: val_loss improved from 0.69087 to 0.69014, saving model to model_5.best.hdf5
Epoch 10/100
34/34 [==============================] - 52s 2s/step - loss: 0.6864 - binary_accuracy: 0.6178 - val_loss: 0.6897 - val_binary_accuracy: 0.5230

Epoch 00010: val_loss improved from 0.69014 to 0.68973, saving model to model_5.best.hdf5
Epoch 11/100
34/34 [==============================] - 52s 2s/step - loss: 0.6857 - binary_accuracy: 0.6196 - val_loss: 0.6899 - val_binary_accuracy: 0.5115

Epoch 00011: val_loss did not improve from 0.68973
Epoch 12/100
34/34 [==============================] - 53s 2s/step - loss: 0.6854 - binary_accuracy: 0.6182 - val_loss: 0.6898 - val_binary_accuracy: 0.5345

Epoch 00012: val_loss did not improve from 0.68973
Epoch 13/100
34/34 [==============================] - 52s 2s/step - loss: 0.6850 - binary_accuracy: 0.6140 - val_loss: 0.6895 - val_binary_accuracy: 0.5575

Epoch 00013: val_loss improved from 0.68973 to 0.68955, saving model to model_5.best.hdf5
Epoch 14/100
34/34 [==============================] - 51s 2s/step - loss: 0.6847 - binary_accuracy: 0.6075 - val_loss: 0.6892 - val_binary_accuracy: 0.5345

Epoch 00014: val_loss improved from 0.68955 to 0.68917, saving model to model_5.best.hdf5
Epoch 15/100
34/34 [==============================] - 52s 2s/step - loss: 0.6836 - binary_accuracy: 0.6234 - val_loss: 0.6893 - val_binary_accuracy: 0.5402

Epoch 00015: val_loss did not improve from 0.68917
Epoch 16/100
34/34 [==============================] - 52s 2s/step - loss: 0.6837 - binary_accuracy: 0.6206 - val_loss: 0.6892 - val_binary_accuracy: 0.5287

Epoch 00016: val_loss did not improve from 0.68917
Epoch 17/100
34/34 [==============================] - 54s 2s/step - loss: 0.6827 - binary_accuracy: 0.6196 - val_loss: 0.6897 - val_binary_accuracy: 0.5460

Epoch 00017: val_loss did not improve from 0.68917
Epoch 18/100
34/34 [==============================] - 53s 2s/step - loss: 0.6822 - binary_accuracy: 0.6262 - val_loss: 0.6899 - val_binary_accuracy: 0.5402

Epoch 00018: val_loss did not improve from 0.68917
Epoch 19/100
34/34 [==============================] - 53s 2s/step - loss: 0.6841 - binary_accuracy: 0.6019 - val_loss: 0.6894 - val_binary_accuracy: 0.5460

Epoch 00019: val_loss did not improve from 0.68917
Epoch 20/100
34/34 [==============================] - 52s 2s/step - loss: 0.6816 - binary_accuracy: 0.6168 - val_loss: 0.6892 - val_binary_accuracy: 0.5460

Epoch 00020: val_loss improved from 0.68917 to 0.68915, saving model to model_5.best.hdf5
Epoch 21/100
34/34 [==============================] - 51s 1s/step - loss: 0.6819 - binary_accuracy: 0.6243 - val_loss: 0.6890 - val_binary_accuracy: 0.5345

Epoch 00021: val_loss improved from 0.68915 to 0.68899, saving model to model_5.best.hdf5
Epoch 22/100
34/34 [==============================] - 52s 2s/step - loss: 0.6818 - binary_accuracy: 0.6051 - val_loss: 0.6886 - val_binary_accuracy: 0.5230

Epoch 00022: val_loss improved from 0.68899 to 0.68865, saving model to model_5.best.hdf5
Epoch 23/100
34/34 [==============================] - 50s 1s/step - loss: 0.6808 - binary_accuracy: 0.6192 - val_loss: 0.6885 - val_binary_accuracy: 0.5172

Epoch 00023: val_loss improved from 0.68865 to 0.68855, saving model to model_5.best.hdf5
Epoch 24/100
34/34 [==============================] - 51s 1s/step - loss: 0.6796 - binary_accuracy: 0.6266 - val_loss: 0.6884 - val_binary_accuracy: 0.5172

Epoch 00024: val_loss improved from 0.68855 to 0.68839, saving model to model_5.best.hdf5
Epoch 25/100
34/34 [==============================] - 51s 2s/step - loss: 0.6794 - binary_accuracy: 0.6280 - val_loss: 0.6883 - val_binary_accuracy: 0.5287

Epoch 00025: val_loss improved from 0.68839 to 0.68833, saving model to model_5.best.hdf5
Epoch 26/100
34/34 [==============================] - 52s 2s/step - loss: 0.6792 - binary_accuracy: 0.6159 - val_loss: 0.6884 - val_binary_accuracy: 0.5460

Epoch 00026: val_loss did not improve from 0.68833
Epoch 27/100
34/34 [==============================] - 51s 2s/step - loss: 0.6796 - binary_accuracy: 0.6262 - val_loss: 0.6880 - val_binary_accuracy: 0.5402

Epoch 00027: val_loss improved from 0.68833 to 0.68798, saving model to model_5.best.hdf5
Epoch 28/100
34/34 [==============================] - 51s 2s/step - loss: 0.6784 - binary_accuracy: 0.6234 - val_loss: 0.6877 - val_binary_accuracy: 0.5287

Epoch 00028: val_loss improved from 0.68798 to 0.68767, saving model to model_5.best.hdf5
Epoch 29/100
34/34 [==============================] - 53s 2s/step - loss: 0.6779 - binary_accuracy: 0.6360 - val_loss: 0.6877 - val_binary_accuracy: 0.5460

Epoch 00029: val_loss did not improve from 0.68767
Epoch 30/100
34/34 [==============================] - 51s 2s/step - loss: 0.6775 - binary_accuracy: 0.6276 - val_loss: 0.6871 - val_binary_accuracy: 0.5460

Epoch 00030: val_loss improved from 0.68767 to 0.68713, saving model to model_5.best.hdf5
Epoch 31/100
34/34 [==============================] - 51s 1s/step - loss: 0.6772 - binary_accuracy: 0.6271 - val_loss: 0.6866 - val_binary_accuracy: 0.5287

Epoch 00031: val_loss improved from 0.68713 to 0.68664, saving model to model_5.best.hdf5
Epoch 32/100
34/34 [==============================] - 51s 2s/step - loss: 0.6771 - binary_accuracy: 0.6201 - val_loss: 0.6869 - val_binary_accuracy: 0.5517

Epoch 00032: val_loss did not improve from 0.68664
Epoch 33/100
34/34 [==============================] - 52s 2s/step - loss: 0.6777 - binary_accuracy: 0.6229 - val_loss: 0.6869 - val_binary_accuracy: 0.5402

Epoch 00033: val_loss did not improve from 0.68664
Epoch 34/100
34/34 [==============================] - 51s 2s/step - loss: 0.6752 - binary_accuracy: 0.6350 - val_loss: 0.6869 - val_binary_accuracy: 0.5402

Epoch 00034: val_loss did not improve from 0.68664
Epoch 35/100
34/34 [==============================] - 51s 2s/step - loss: 0.6768 - binary_accuracy: 0.6192 - val_loss: 0.6865 - val_binary_accuracy: 0.5345

Epoch 00035: val_loss improved from 0.68664 to 0.68646, saving model to model_5.best.hdf5
Epoch 36/100
34/34 [==============================] - 52s 2s/step - loss: 0.6748 - binary_accuracy: 0.6355 - val_loss: 0.6859 - val_binary_accuracy: 0.5517

Epoch 00036: val_loss improved from 0.68646 to 0.68586, saving model to model_5.best.hdf5
Epoch 37/100
34/34 [==============================] - 51s 1s/step - loss: 0.6742 - binary_accuracy: 0.6350 - val_loss: 0.6854 - val_binary_accuracy: 0.5517

Epoch 00037: val_loss improved from 0.68586 to 0.68536, saving model to model_5.best.hdf5
Epoch 38/100
34/34 [==============================] - 51s 2s/step - loss: 0.6733 - binary_accuracy: 0.6383 - val_loss: 0.6851 - val_binary_accuracy: 0.5517

Epoch 00038: val_loss improved from 0.68536 to 0.68511, saving model to model_5.best.hdf5
Epoch 39/100
34/34 [==============================] - 51s 2s/step - loss: 0.6751 - binary_accuracy: 0.6252 - val_loss: 0.6847 - val_binary_accuracy: 0.5575

Epoch 00039: val_loss improved from 0.68511 to 0.68469, saving model to model_5.best.hdf5
Epoch 40/100
34/34 [==============================] - 52s 2s/step - loss: 0.6754 - binary_accuracy: 0.6168 - val_loss: 0.6855 - val_binary_accuracy: 0.5460

Epoch 00040: val_loss did not improve from 0.68469
Epoch 41/100
34/34 [==============================] - 51s 1s/step - loss: 0.6737 - binary_accuracy: 0.6285 - val_loss: 0.6856 - val_binary_accuracy: 0.5402

Epoch 00041: val_loss did not improve from 0.68469
Epoch 42/100
34/34 [==============================] - 52s 2s/step - loss: 0.6726 - binary_accuracy: 0.6346 - val_loss: 0.6855 - val_binary_accuracy: 0.5402

Epoch 00042: val_loss did not improve from 0.68469
Epoch 43/100
34/34 [==============================] - 52s 2s/step - loss: 0.6725 - binary_accuracy: 0.6355 - val_loss: 0.6856 - val_binary_accuracy: 0.5345

Epoch 00043: val_loss did not improve from 0.68469
Epoch 44/100
34/34 [==============================] - 51s 2s/step - loss: 0.6723 - binary_accuracy: 0.6322 - val_loss: 0.6855 - val_binary_accuracy: 0.5287

Epoch 00044: val_loss did not improve from 0.68469
Epoch 45/100
34/34 [==============================] - 52s 2s/step - loss: 0.6727 - binary_accuracy: 0.6346 - val_loss: 0.6851 - val_binary_accuracy: 0.5460

Epoch 00045: val_loss did not improve from 0.68469
Epoch 46/100
34/34 [==============================] - 52s 2s/step - loss: 0.6718 - binary_accuracy: 0.6369 - val_loss: 0.6849 - val_binary_accuracy: 0.5575

Epoch 00046: val_loss did not improve from 0.68469
Epoch 47/100
34/34 [==============================] - 52s 2s/step - loss: 0.6725 - binary_accuracy: 0.6290 - val_loss: 0.6858 - val_binary_accuracy: 0.5287

Epoch 00047: val_loss did not improve from 0.68469
Epoch 48/100
34/34 [==============================] - 51s 2s/step - loss: 0.6710 - binary_accuracy: 0.6388 - val_loss: 0.6853 - val_binary_accuracy: 0.5230

Epoch 00048: val_loss did not improve from 0.68469
Epoch 49/100
34/34 [==============================] - 52s 2s/step - loss: 0.6709 - binary_accuracy: 0.6383 - val_loss: 0.6854 - val_binary_accuracy: 0.5287

Epoch 00049: val_loss did not improve from 0.68469
Epoch 50/100
34/34 [==============================] - 52s 2s/step - loss: 0.6699 - binary_accuracy: 0.6332 - val_loss: 0.6842 - val_binary_accuracy: 0.5632

Epoch 00050: val_loss improved from 0.68469 to 0.68419, saving model to model_5.best.hdf5
Epoch 51/100
34/34 [==============================] - 52s 2s/step - loss: 0.6714 - binary_accuracy: 0.6285 - val_loss: 0.6861 - val_binary_accuracy: 0.5402

Epoch 00051: val_loss did not improve from 0.68419
Epoch 52/100
34/34 [==============================] - 54s 2s/step - loss: 0.6695 - binary_accuracy: 0.6322 - val_loss: 0.6857 - val_binary_accuracy: 0.5287

Epoch 00052: val_loss did not improve from 0.68419
Epoch 53/100
34/34 [==============================] - 53s 2s/step - loss: 0.6696 - binary_accuracy: 0.6379 - val_loss: 0.6850 - val_binary_accuracy: 0.5345

Epoch 00053: val_loss did not improve from 0.68419
Epoch 54/100
34/34 [==============================] - 53s 2s/step - loss: 0.6697 - binary_accuracy: 0.6379 - val_loss: 0.6849 - val_binary_accuracy: 0.5345

Epoch 00054: val_loss did not improve from 0.68419
Epoch 55/100
34/34 [==============================] - 55s 2s/step - loss: 0.6695 - binary_accuracy: 0.6336 - val_loss: 0.6838 - val_binary_accuracy: 0.5575

Epoch 00055: val_loss improved from 0.68419 to 0.68380, saving model to model_5.best.hdf5
Epoch 56/100
34/34 [==============================] - 54s 2s/step - loss: 0.6674 - binary_accuracy: 0.6379 - val_loss: 0.6848 - val_binary_accuracy: 0.5345

Epoch 00056: val_loss did not improve from 0.68380
Epoch 57/100
34/34 [==============================] - 54s 2s/step - loss: 0.6695 - binary_accuracy: 0.6388 - val_loss: 0.6851 - val_binary_accuracy: 0.5345

Epoch 00057: val_loss did not improve from 0.68380
Epoch 58/100
34/34 [==============================] - 55s 2s/step - loss: 0.6667 - binary_accuracy: 0.6523 - val_loss: 0.6844 - val_binary_accuracy: 0.5402

Epoch 00058: val_loss did not improve from 0.68380
Epoch 59/100
34/34 [==============================] - 55s 2s/step - loss: 0.6698 - binary_accuracy: 0.6238 - val_loss: 0.6841 - val_binary_accuracy: 0.5402

Epoch 00059: val_loss did not improve from 0.68380
Epoch 60/100
34/34 [==============================] - 55s 2s/step - loss: 0.6658 - binary_accuracy: 0.6542 - val_loss: 0.6839 - val_binary_accuracy: 0.5402

Epoch 00060: val_loss did not improve from 0.68380
Epoch 61/100
34/34 [==============================] - 56s 2s/step - loss: 0.6675 - binary_accuracy: 0.6355 - val_loss: 0.6837 - val_binary_accuracy: 0.5402

Epoch 00061: val_loss improved from 0.68380 to 0.68368, saving model to model_5.best.hdf5
Epoch 62/100
34/34 [==============================] - 54s 2s/step - loss: 0.6665 - binary_accuracy: 0.6393 - val_loss: 0.6841 - val_binary_accuracy: 0.5402

Epoch 00062: val_loss did not improve from 0.68368
Epoch 63/100
34/34 [==============================] - 55s 2s/step - loss: 0.6662 - binary_accuracy: 0.6435 - val_loss: 0.6845 - val_binary_accuracy: 0.5402

Epoch 00063: val_loss did not improve from 0.68368
Epoch 64/100
34/34 [==============================] - 54s 2s/step - loss: 0.6652 - binary_accuracy: 0.6463 - val_loss: 0.6846 - val_binary_accuracy: 0.5345

Epoch 00064: val_loss did not improve from 0.68368
Epoch 65/100
34/34 [==============================] - 54s 2s/step - loss: 0.6666 - binary_accuracy: 0.6425 - val_loss: 0.6840 - val_binary_accuracy: 0.5402

Epoch 00065: val_loss did not improve from 0.68368
Epoch 66/100
34/34 [==============================] - 54s 2s/step - loss: 0.6670 - binary_accuracy: 0.6360 - val_loss: 0.6840 - val_binary_accuracy: 0.5460

Epoch 00066: val_loss did not improve from 0.68368
Epoch 67/100
34/34 [==============================] - 54s 2s/step - loss: 0.6660 - binary_accuracy: 0.6397 - val_loss: 0.6832 - val_binary_accuracy: 0.5402

Epoch 00067: val_loss improved from 0.68368 to 0.68317, saving model to model_5.best.hdf5
Epoch 68/100
34/34 [==============================] - 54s 2s/step - loss: 0.6663 - binary_accuracy: 0.6308 - val_loss: 0.6839 - val_binary_accuracy: 0.5460

Epoch 00068: val_loss did not improve from 0.68317
Epoch 69/100
34/34 [==============================] - 55s 2s/step - loss: 0.6652 - binary_accuracy: 0.6407 - val_loss: 0.6844 - val_binary_accuracy: 0.5402

Epoch 00069: val_loss did not improve from 0.68317
Epoch 70/100
34/34 [==============================] - 54s 2s/step - loss: 0.6633 - binary_accuracy: 0.6453 - val_loss: 0.6842 - val_binary_accuracy: 0.5460

Epoch 00070: val_loss did not improve from 0.68317
Epoch 71/100
34/34 [==============================] - 53s 2s/step - loss: 0.6637 - binary_accuracy: 0.6383 - val_loss: 0.6826 - val_binary_accuracy: 0.5517

Epoch 00071: val_loss improved from 0.68317 to 0.68255, saving model to model_5.best.hdf5
Epoch 72/100
34/34 [==============================] - 52s 2s/step - loss: 0.6623 - binary_accuracy: 0.6495 - val_loss: 0.6846 - val_binary_accuracy: 0.5460

Epoch 00072: val_loss did not improve from 0.68255
Epoch 73/100
34/34 [==============================] - 52s 2s/step - loss: 0.6655 - binary_accuracy: 0.6402 - val_loss: 0.6841 - val_binary_accuracy: 0.5460

Epoch 00073: val_loss did not improve from 0.68255
Epoch 74/100
34/34 [==============================] - 52s 2s/step - loss: 0.6636 - binary_accuracy: 0.6458 - val_loss: 0.6832 - val_binary_accuracy: 0.5402

Epoch 00074: val_loss did not improve from 0.68255
Epoch 75/100
34/34 [==============================] - 53s 2s/step - loss: 0.6636 - binary_accuracy: 0.6350 - val_loss: 0.6834 - val_binary_accuracy: 0.5402

Epoch 00075: val_loss did not improve from 0.68255
Epoch 76/100
34/34 [==============================] - 52s 2s/step - loss: 0.6647 - binary_accuracy: 0.6318 - val_loss: 0.6822 - val_binary_accuracy: 0.5632

Epoch 00076: val_loss improved from 0.68255 to 0.68223, saving model to model_5.best.hdf5
Epoch 77/100
34/34 [==============================] - 51s 1s/step - loss: 0.6622 - binary_accuracy: 0.6388 - val_loss: 0.6822 - val_binary_accuracy: 0.5632

Epoch 00077: val_loss improved from 0.68223 to 0.68217, saving model to model_5.best.hdf5
Epoch 78/100
34/34 [==============================] - 51s 2s/step - loss: 0.6640 - binary_accuracy: 0.6374 - val_loss: 0.6831 - val_binary_accuracy: 0.5402

Epoch 00078: val_loss did not improve from 0.68217
Epoch 79/100
34/34 [==============================] - 53s 2s/step - loss: 0.6602 - binary_accuracy: 0.6477 - val_loss: 0.6838 - val_binary_accuracy: 0.5460

Epoch 00079: val_loss did not improve from 0.68217
Epoch 80/100
34/34 [==============================] - 51s 2s/step - loss: 0.6629 - binary_accuracy: 0.6402 - val_loss: 0.6838 - val_binary_accuracy: 0.5460

Epoch 00080: val_loss did not improve from 0.68217
Epoch 81/100
34/34 [==============================] - 52s 2s/step - loss: 0.6597 - binary_accuracy: 0.6481 - val_loss: 0.6828 - val_binary_accuracy: 0.5460

Epoch 00081: val_loss did not improve from 0.68217
Epoch 82/100
34/34 [==============================] - 51s 2s/step - loss: 0.6634 - binary_accuracy: 0.6374 - val_loss: 0.6840 - val_binary_accuracy: 0.5460

Epoch 00082: val_loss did not improve from 0.68217
Epoch 83/100
34/34 [==============================] - 52s 2s/step - loss: 0.6599 - binary_accuracy: 0.6495 - val_loss: 0.6825 - val_binary_accuracy: 0.5575

Epoch 00083: val_loss did not improve from 0.68217
Epoch 84/100
34/34 [==============================] - 50s 1s/step - loss: 0.6601 - binary_accuracy: 0.6523 - val_loss: 0.6833 - val_binary_accuracy: 0.5517

Epoch 00084: val_loss did not improve from 0.68217
Epoch 85/100
34/34 [==============================] - 51s 1s/step - loss: 0.6612 - binary_accuracy: 0.6467 - val_loss: 0.6828 - val_binary_accuracy: 0.5575

Epoch 00085: val_loss did not improve from 0.68217
Epoch 86/100
34/34 [==============================] - 52s 2s/step - loss: 0.6607 - binary_accuracy: 0.6439 - val_loss: 0.6835 - val_binary_accuracy: 0.5517

Epoch 00086: val_loss did not improve from 0.68217
Epoch 87/100
34/34 [==============================] - 50s 1s/step - loss: 0.6584 - binary_accuracy: 0.6565 - val_loss: 0.6822 - val_binary_accuracy: 0.5690

Epoch 00087: val_loss did not improve from 0.68217
Epoch 88/100
34/34 [==============================] - 51s 2s/step - loss: 0.6595 - binary_accuracy: 0.6514 - val_loss: 0.6827 - val_binary_accuracy: 0.5575

Epoch 00088: val_loss did not improve from 0.68217
Epoch 89/100
34/34 [==============================] - 51s 2s/step - loss: 0.6584 - binary_accuracy: 0.6477 - val_loss: 0.6818 - val_binary_accuracy: 0.5575

Epoch 00089: val_loss improved from 0.68217 to 0.68176, saving model to model_5.best.hdf5
Epoch 90/100
34/34 [==============================] - 50s 1s/step - loss: 0.6579 - binary_accuracy: 0.6458 - val_loss: 0.6823 - val_binary_accuracy: 0.5632

Epoch 00090: val_loss did not improve from 0.68176
Epoch 91/100
34/34 [==============================] - 49s 1s/step - loss: 0.6590 - binary_accuracy: 0.6481 - val_loss: 0.6836 - val_binary_accuracy: 0.5517

Epoch 00091: val_loss did not improve from 0.68176
Epoch 92/100
34/34 [==============================] - 52s 2s/step - loss: 0.6576 - binary_accuracy: 0.6519 - val_loss: 0.6838 - val_binary_accuracy: 0.5517

Epoch 00092: val_loss did not improve from 0.68176
Epoch 93/100
34/34 [==============================] - 49s 1s/step - loss: 0.6615 - binary_accuracy: 0.6322 - val_loss: 0.6827 - val_binary_accuracy: 0.5575

Epoch 00093: val_loss did not improve from 0.68176
Epoch 94/100
34/34 [==============================] - 50s 1s/step - loss: 0.6587 - binary_accuracy: 0.6537 - val_loss: 0.6815 - val_binary_accuracy: 0.5575

Epoch 00094: val_loss improved from 0.68176 to 0.68154, saving model to model_5.best.hdf5
Epoch 95/100
34/34 [==============================] - 51s 1s/step - loss: 0.6569 - binary_accuracy: 0.6514 - val_loss: 0.6817 - val_binary_accuracy: 0.5575

Epoch 00095: val_loss did not improve from 0.68154
Epoch 96/100
34/34 [==============================] - 51s 1s/step - loss: 0.6562 - binary_accuracy: 0.6505 - val_loss: 0.6818 - val_binary_accuracy: 0.5575

Epoch 00096: val_loss did not improve from 0.68154
Epoch 97/100
34/34 [==============================] - 51s 1s/step - loss: 0.6574 - binary_accuracy: 0.6477 - val_loss: 0.6810 - val_binary_accuracy: 0.5575

Epoch 00097: val_loss improved from 0.68154 to 0.68103, saving model to model_5.best.hdf5
Epoch 98/100
34/34 [==============================] - 52s 2s/step - loss: 0.6579 - binary_accuracy: 0.6467 - val_loss: 0.6812 - val_binary_accuracy: 0.5517

Epoch 00098: val_loss did not improve from 0.68103
Epoch 99/100
34/34 [==============================] - 52s 2s/step - loss: 0.6579 - binary_accuracy: 0.6528 - val_loss: 0.6793 - val_binary_accuracy: 0.5575

Epoch 00099: val_loss improved from 0.68103 to 0.67933, saving model to model_5.best.hdf5
Epoch 100/100
34/34 [==============================] - 51s 1s/step - loss: 0.6556 - binary_accuracy: 0.6589 - val_loss: 0.6802 - val_binary_accuracy: 0.5690

Epoch 00100: val_loss did not improve from 0.67933
In [14]:
pred_Y_5, ground_truth_5, evaluation_dic_5 = predict_and_evaluate_model(model_5, "model_5", test_gen, steps=len(test_df)/64)
22/21 [==============================] - 23s 1s/step
Model prediction min: 0.415
Model prediction max: 0.588
  • The training and validation losses (accuracies) were decreasing (increasing) verly slowly during the training, indicating very slow learning.
  • The prediction distribution exhibits single peak for the pneumonia negative class.

The starting learning rate will be increased in the next model.

Model 6

In [15]:
# The same as model 5 with increased learning rate

# Model 5: 
## transfer layer: avg_pool (idx 427)
## learning rate E-3

model_6_pretrained = load_pretrained_model(pretrained_model=DenseNet121, transfer_layer='avg_pool', 
                          transfer_layer_idx=427)
model_6 = build_simpler_model(model_6_pretrained, lr=0.001)
save_model(model_6, 'model_6')
Pre-trained model layers and their trainability
input_2 False
zero_padding2d_3 False
conv1/conv False
conv1/bn False
conv1/relu False
zero_padding2d_4 False
pool1 False
conv2_block1_0_bn False
conv2_block1_0_relu False
conv2_block1_1_conv False
conv2_block1_1_bn False
conv2_block1_1_relu False
conv2_block1_2_conv False
conv2_block1_concat False
conv2_block2_0_bn False
conv2_block2_0_relu False
conv2_block2_1_conv False
conv2_block2_1_bn False
conv2_block2_1_relu False
conv2_block2_2_conv False
conv2_block2_concat False
conv2_block3_0_bn False
conv2_block3_0_relu False
conv2_block3_1_conv False
conv2_block3_1_bn False
conv2_block3_1_relu False
conv2_block3_2_conv False
conv2_block3_concat False
conv2_block4_0_bn False
conv2_block4_0_relu False
conv2_block4_1_conv False
conv2_block4_1_bn False
conv2_block4_1_relu False
conv2_block4_2_conv False
conv2_block4_concat False
conv2_block5_0_bn False
conv2_block5_0_relu False
conv2_block5_1_conv False
conv2_block5_1_bn False
conv2_block5_1_relu False
conv2_block5_2_conv False
conv2_block5_concat False
conv2_block6_0_bn False
conv2_block6_0_relu False
conv2_block6_1_conv False
conv2_block6_1_bn False
conv2_block6_1_relu False
conv2_block6_2_conv False
conv2_block6_concat False
pool2_bn False
pool2_relu False
pool2_conv False
pool2_pool False
conv3_block1_0_bn False
conv3_block1_0_relu False
conv3_block1_1_conv False
conv3_block1_1_bn False
conv3_block1_1_relu False
conv3_block1_2_conv False
conv3_block1_concat False
conv3_block2_0_bn False
conv3_block2_0_relu False
conv3_block2_1_conv False
conv3_block2_1_bn False
conv3_block2_1_relu False
conv3_block2_2_conv False
conv3_block2_concat False
conv3_block3_0_bn False
conv3_block3_0_relu False
conv3_block3_1_conv False
conv3_block3_1_bn False
conv3_block3_1_relu False
conv3_block3_2_conv False
conv3_block3_concat False
conv3_block4_0_bn False
conv3_block4_0_relu False
conv3_block4_1_conv False
conv3_block4_1_bn False
conv3_block4_1_relu False
conv3_block4_2_conv False
conv3_block4_concat False
conv3_block5_0_bn False
conv3_block5_0_relu False
conv3_block5_1_conv False
conv3_block5_1_bn False
conv3_block5_1_relu False
conv3_block5_2_conv False
conv3_block5_concat False
conv3_block6_0_bn False
conv3_block6_0_relu False
conv3_block6_1_conv False
conv3_block6_1_bn False
conv3_block6_1_relu False
conv3_block6_2_conv False
conv3_block6_concat False
conv3_block7_0_bn False
conv3_block7_0_relu False
conv3_block7_1_conv False
conv3_block7_1_bn False
conv3_block7_1_relu False
conv3_block7_2_conv False
conv3_block7_concat False
conv3_block8_0_bn False
conv3_block8_0_relu False
conv3_block8_1_conv False
conv3_block8_1_bn False
conv3_block8_1_relu False
conv3_block8_2_conv False
conv3_block8_concat False
conv3_block9_0_bn False
conv3_block9_0_relu False
conv3_block9_1_conv False
conv3_block9_1_bn False
conv3_block9_1_relu False
conv3_block9_2_conv False
conv3_block9_concat False
conv3_block10_0_bn False
conv3_block10_0_relu False
conv3_block10_1_conv False
conv3_block10_1_bn False
conv3_block10_1_relu False
conv3_block10_2_conv False
conv3_block10_concat False
conv3_block11_0_bn False
conv3_block11_0_relu False
conv3_block11_1_conv False
conv3_block11_1_bn False
conv3_block11_1_relu False
conv3_block11_2_conv False
conv3_block11_concat False
conv3_block12_0_bn False
conv3_block12_0_relu False
conv3_block12_1_conv False
conv3_block12_1_bn False
conv3_block12_1_relu False
conv3_block12_2_conv False
conv3_block12_concat False
pool3_bn False
pool3_relu False
pool3_conv False
pool3_pool False
conv4_block1_0_bn False
conv4_block1_0_relu False
conv4_block1_1_conv False
conv4_block1_1_bn False
conv4_block1_1_relu False
conv4_block1_2_conv False
conv4_block1_concat False
conv4_block2_0_bn False
conv4_block2_0_relu False
conv4_block2_1_conv False
conv4_block2_1_bn False
conv4_block2_1_relu False
conv4_block2_2_conv False
conv4_block2_concat False
conv4_block3_0_bn False
conv4_block3_0_relu False
conv4_block3_1_conv False
conv4_block3_1_bn False
conv4_block3_1_relu False
conv4_block3_2_conv False
conv4_block3_concat False
conv4_block4_0_bn False
conv4_block4_0_relu False
conv4_block4_1_conv False
conv4_block4_1_bn False
conv4_block4_1_relu False
conv4_block4_2_conv False
conv4_block4_concat False
conv4_block5_0_bn False
conv4_block5_0_relu False
conv4_block5_1_conv False
conv4_block5_1_bn False
conv4_block5_1_relu False
conv4_block5_2_conv False
conv4_block5_concat False
conv4_block6_0_bn False
conv4_block6_0_relu False
conv4_block6_1_conv False
conv4_block6_1_bn False
conv4_block6_1_relu False
conv4_block6_2_conv False
conv4_block6_concat False
conv4_block7_0_bn False
conv4_block7_0_relu False
conv4_block7_1_conv False
conv4_block7_1_bn False
conv4_block7_1_relu False
conv4_block7_2_conv False
conv4_block7_concat False
conv4_block8_0_bn False
conv4_block8_0_relu False
conv4_block8_1_conv False
conv4_block8_1_bn False
conv4_block8_1_relu False
conv4_block8_2_conv False
conv4_block8_concat False
conv4_block9_0_bn False
conv4_block9_0_relu False
conv4_block9_1_conv False
conv4_block9_1_bn False
conv4_block9_1_relu False
conv4_block9_2_conv False
conv4_block9_concat False
conv4_block10_0_bn False
conv4_block10_0_relu False
conv4_block10_1_conv False
conv4_block10_1_bn False
conv4_block10_1_relu False
conv4_block10_2_conv False
conv4_block10_concat False
conv4_block11_0_bn False
conv4_block11_0_relu False
conv4_block11_1_conv False
conv4_block11_1_bn False
conv4_block11_1_relu False
conv4_block11_2_conv False
conv4_block11_concat False
conv4_block12_0_bn False
conv4_block12_0_relu False
conv4_block12_1_conv False
conv4_block12_1_bn False
conv4_block12_1_relu False
conv4_block12_2_conv False
conv4_block12_concat False
conv4_block13_0_bn False
conv4_block13_0_relu False
conv4_block13_1_conv False
conv4_block13_1_bn False
conv4_block13_1_relu False
conv4_block13_2_conv False
conv4_block13_concat False
conv4_block14_0_bn False
conv4_block14_0_relu False
conv4_block14_1_conv False
conv4_block14_1_bn False
conv4_block14_1_relu False
conv4_block14_2_conv False
conv4_block14_concat False
conv4_block15_0_bn False
conv4_block15_0_relu False
conv4_block15_1_conv False
conv4_block15_1_bn False
conv4_block15_1_relu False
conv4_block15_2_conv False
conv4_block15_concat False
conv4_block16_0_bn False
conv4_block16_0_relu False
conv4_block16_1_conv False
conv4_block16_1_bn False
conv4_block16_1_relu False
conv4_block16_2_conv False
conv4_block16_concat False
conv4_block17_0_bn False
conv4_block17_0_relu False
conv4_block17_1_conv False
conv4_block17_1_bn False
conv4_block17_1_relu False
conv4_block17_2_conv False
conv4_block17_concat False
conv4_block18_0_bn False
conv4_block18_0_relu False
conv4_block18_1_conv False
conv4_block18_1_bn False
conv4_block18_1_relu False
conv4_block18_2_conv False
conv4_block18_concat False
conv4_block19_0_bn False
conv4_block19_0_relu False
conv4_block19_1_conv False
conv4_block19_1_bn False
conv4_block19_1_relu False
conv4_block19_2_conv False
conv4_block19_concat False
conv4_block20_0_bn False
conv4_block20_0_relu False
conv4_block20_1_conv False
conv4_block20_1_bn False
conv4_block20_1_relu False
conv4_block20_2_conv False
conv4_block20_concat False
conv4_block21_0_bn False
conv4_block21_0_relu False
conv4_block21_1_conv False
conv4_block21_1_bn False
conv4_block21_1_relu False
conv4_block21_2_conv False
conv4_block21_concat False
conv4_block22_0_bn False
conv4_block22_0_relu False
conv4_block22_1_conv False
conv4_block22_1_bn False
conv4_block22_1_relu False
conv4_block22_2_conv False
conv4_block22_concat False
conv4_block23_0_bn False
conv4_block23_0_relu False
conv4_block23_1_conv False
conv4_block23_1_bn False
conv4_block23_1_relu False
conv4_block23_2_conv False
conv4_block23_concat False
conv4_block24_0_bn False
conv4_block24_0_relu False
conv4_block24_1_conv False
conv4_block24_1_bn False
conv4_block24_1_relu False
conv4_block24_2_conv False
conv4_block24_concat False
pool4_bn False
pool4_relu False
pool4_conv False
pool4_pool False
conv5_block1_0_bn False
conv5_block1_0_relu False
conv5_block1_1_conv False
conv5_block1_1_bn False
conv5_block1_1_relu False
conv5_block1_2_conv False
conv5_block1_concat False
conv5_block2_0_bn False
conv5_block2_0_relu False
conv5_block2_1_conv False
conv5_block2_1_bn False
conv5_block2_1_relu False
conv5_block2_2_conv False
conv5_block2_concat False
conv5_block3_0_bn False
conv5_block3_0_relu False
conv5_block3_1_conv False
conv5_block3_1_bn False
conv5_block3_1_relu False
conv5_block3_2_conv False
conv5_block3_concat False
conv5_block4_0_bn False
conv5_block4_0_relu False
conv5_block4_1_conv False
conv5_block4_1_bn False
conv5_block4_1_relu False
conv5_block4_2_conv False
conv5_block4_concat False
conv5_block5_0_bn False
conv5_block5_0_relu False
conv5_block5_1_conv False
conv5_block5_1_bn False
conv5_block5_1_relu False
conv5_block5_2_conv False
conv5_block5_concat False
conv5_block6_0_bn False
conv5_block6_0_relu False
conv5_block6_1_conv False
conv5_block6_1_bn False
conv5_block6_1_relu False
conv5_block6_2_conv False
conv5_block6_concat False
conv5_block7_0_bn False
conv5_block7_0_relu False
conv5_block7_1_conv False
conv5_block7_1_bn False
conv5_block7_1_relu False
conv5_block7_2_conv False
conv5_block7_concat False
conv5_block8_0_bn False
conv5_block8_0_relu False
conv5_block8_1_conv False
conv5_block8_1_bn False
conv5_block8_1_relu False
conv5_block8_2_conv False
conv5_block8_concat False
conv5_block9_0_bn False
conv5_block9_0_relu False
conv5_block9_1_conv False
conv5_block9_1_bn False
conv5_block9_1_relu False
conv5_block9_2_conv False
conv5_block9_concat False
conv5_block10_0_bn False
conv5_block10_0_relu False
conv5_block10_1_conv False
conv5_block10_1_bn False
conv5_block10_1_relu False
conv5_block10_2_conv False
conv5_block10_concat False
conv5_block11_0_bn False
conv5_block11_0_relu False
conv5_block11_1_conv False
conv5_block11_1_bn False
conv5_block11_1_relu False
conv5_block11_2_conv False
conv5_block11_concat False
conv5_block12_0_bn False
conv5_block12_0_relu False
conv5_block12_1_conv False
conv5_block12_1_bn False
conv5_block12_1_relu False
conv5_block12_2_conv False
conv5_block12_concat False
conv5_block13_0_bn False
conv5_block13_0_relu False
conv5_block13_1_conv False
conv5_block13_1_bn False
conv5_block13_1_relu False
conv5_block13_2_conv False
conv5_block13_concat False
conv5_block14_0_bn False
conv5_block14_0_relu False
conv5_block14_1_conv False
conv5_block14_1_bn False
conv5_block14_1_relu False
conv5_block14_2_conv False
conv5_block14_concat False
conv5_block15_0_bn False
conv5_block15_0_relu False
conv5_block15_1_conv False
conv5_block15_1_bn False
conv5_block15_1_relu False
conv5_block15_2_conv False
conv5_block15_concat False
conv5_block16_0_bn False
conv5_block16_0_relu False
conv5_block16_1_conv False
conv5_block16_1_bn False
conv5_block16_1_relu False
conv5_block16_2_conv False
conv5_block16_concat False
bn False
relu False
avg_pool True
fc1000 True
Saved model architecture under model_6.json
In [16]:
model_6_hist = train_model(model_6, 'model_6', train_gen, valid_gen, epochs=60)
Epoch 1/60
34/34 [==============================] - 60s 2s/step - loss: 0.6908 - binary_accuracy: 0.5621 - val_loss: 0.6897 - val_binary_accuracy: 0.5747

Epoch 00001: val_loss improved from inf to 0.68970, saving model to model_6.best.hdf5
Epoch 2/60
34/34 [==============================] - 49s 1s/step - loss: 0.6860 - binary_accuracy: 0.6005 - val_loss: 0.6869 - val_binary_accuracy: 0.5747

Epoch 00002: val_loss improved from 0.68970 to 0.68691, saving model to model_6.best.hdf5
Epoch 3/60
34/34 [==============================] - 56s 2s/step - loss: 0.6827 - binary_accuracy: 0.6126 - val_loss: 0.6832 - val_binary_accuracy: 0.5632

Epoch 00003: val_loss improved from 0.68691 to 0.68321, saving model to model_6.best.hdf5
Epoch 4/60
34/34 [==============================] - 55s 2s/step - loss: 0.6794 - binary_accuracy: 0.6112 - val_loss: 0.6858 - val_binary_accuracy: 0.5230

Epoch 00004: val_loss did not improve from 0.68321
Epoch 5/60
34/34 [==============================] - 55s 2s/step - loss: 0.6746 - binary_accuracy: 0.6304 - val_loss: 0.6794 - val_binary_accuracy: 0.5402

Epoch 00005: val_loss improved from 0.68321 to 0.67935, saving model to model_6.best.hdf5
Epoch 6/60
34/34 [==============================] - 56s 2s/step - loss: 0.6720 - binary_accuracy: 0.6252 - val_loss: 0.6768 - val_binary_accuracy: 0.5517

Epoch 00006: val_loss improved from 0.67935 to 0.67683, saving model to model_6.best.hdf5
Epoch 7/60
34/34 [==============================] - 55s 2s/step - loss: 0.6671 - binary_accuracy: 0.6304 - val_loss: 0.6741 - val_binary_accuracy: 0.5402

Epoch 00007: val_loss improved from 0.67683 to 0.67407, saving model to model_6.best.hdf5
Epoch 8/60
34/34 [==============================] - 55s 2s/step - loss: 0.6646 - binary_accuracy: 0.6364 - val_loss: 0.6819 - val_binary_accuracy: 0.5287

Epoch 00008: val_loss did not improve from 0.67407
Epoch 9/60
34/34 [==============================] - 55s 2s/step - loss: 0.6634 - binary_accuracy: 0.6332 - val_loss: 0.6736 - val_binary_accuracy: 0.5517

Epoch 00009: val_loss improved from 0.67407 to 0.67362, saving model to model_6.best.hdf5
Epoch 10/60
34/34 [==============================] - 56s 2s/step - loss: 0.6606 - binary_accuracy: 0.6364 - val_loss: 0.6781 - val_binary_accuracy: 0.5460

Epoch 00010: val_loss did not improve from 0.67362
Epoch 11/60
34/34 [==============================] - 55s 2s/step - loss: 0.6595 - binary_accuracy: 0.6379 - val_loss: 0.6728 - val_binary_accuracy: 0.5517

Epoch 00011: val_loss improved from 0.67362 to 0.67277, saving model to model_6.best.hdf5
Epoch 12/60
34/34 [==============================] - 55s 2s/step - loss: 0.6567 - binary_accuracy: 0.6402 - val_loss: 0.6690 - val_binary_accuracy: 0.5230

Epoch 00012: val_loss improved from 0.67277 to 0.66895, saving model to model_6.best.hdf5
Epoch 13/60
34/34 [==============================] - 53s 2s/step - loss: 0.6526 - binary_accuracy: 0.6458 - val_loss: 0.6737 - val_binary_accuracy: 0.5460

Epoch 00013: val_loss did not improve from 0.66895
Epoch 14/60
34/34 [==============================] - 53s 2s/step - loss: 0.6508 - binary_accuracy: 0.6467 - val_loss: 0.6650 - val_binary_accuracy: 0.5172

Epoch 00014: val_loss improved from 0.66895 to 0.66496, saving model to model_6.best.hdf5
Epoch 15/60
34/34 [==============================] - 52s 2s/step - loss: 0.6487 - binary_accuracy: 0.6551 - val_loss: 0.6651 - val_binary_accuracy: 0.5172

Epoch 00015: val_loss did not improve from 0.66496
Epoch 16/60
34/34 [==============================] - 53s 2s/step - loss: 0.6500 - binary_accuracy: 0.6477 - val_loss: 0.6662 - val_binary_accuracy: 0.5460

Epoch 00016: val_loss did not improve from 0.66496
Epoch 17/60
34/34 [==============================] - 53s 2s/step - loss: 0.6477 - binary_accuracy: 0.6500 - val_loss: 0.6786 - val_binary_accuracy: 0.5460

Epoch 00017: val_loss did not improve from 0.66496
Epoch 18/60
34/34 [==============================] - 53s 2s/step - loss: 0.6452 - binary_accuracy: 0.6505 - val_loss: 0.6680 - val_binary_accuracy: 0.5287

Epoch 00018: val_loss did not improve from 0.66496
Epoch 19/60
34/34 [==============================] - 53s 2s/step - loss: 0.6436 - binary_accuracy: 0.6509 - val_loss: 0.6739 - val_binary_accuracy: 0.5517

Epoch 00019: val_loss did not improve from 0.66496
Epoch 20/60
34/34 [==============================] - 53s 2s/step - loss: 0.6413 - binary_accuracy: 0.6542 - val_loss: 0.6660 - val_binary_accuracy: 0.5230

Epoch 00020: val_loss did not improve from 0.66496
Epoch 21/60
34/34 [==============================] - 53s 2s/step - loss: 0.6416 - binary_accuracy: 0.6561 - val_loss: 0.6460 - val_binary_accuracy: 0.5517

Epoch 00021: val_loss improved from 0.66496 to 0.64598, saving model to model_6.best.hdf5
Epoch 22/60
34/34 [==============================] - 53s 2s/step - loss: 0.6424 - binary_accuracy: 0.6463 - val_loss: 0.6628 - val_binary_accuracy: 0.5287

Epoch 00022: val_loss did not improve from 0.64598
Epoch 23/60
34/34 [==============================] - 53s 2s/step - loss: 0.6387 - binary_accuracy: 0.6519 - val_loss: 0.6528 - val_binary_accuracy: 0.5460

Epoch 00023: val_loss did not improve from 0.64598
Epoch 24/60
34/34 [==============================] - 54s 2s/step - loss: 0.6363 - binary_accuracy: 0.6593 - val_loss: 0.6682 - val_binary_accuracy: 0.5460

Epoch 00024: val_loss did not improve from 0.64598
Epoch 25/60
34/34 [==============================] - 54s 2s/step - loss: 0.6338 - binary_accuracy: 0.6645 - val_loss: 0.6567 - val_binary_accuracy: 0.5345

Epoch 00025: val_loss did not improve from 0.64598
Epoch 26/60
34/34 [==============================] - 53s 2s/step - loss: 0.6292 - binary_accuracy: 0.6729 - val_loss: 0.6438 - val_binary_accuracy: 0.5747

Epoch 00026: val_loss improved from 0.64598 to 0.64379, saving model to model_6.best.hdf5
Epoch 27/60
34/34 [==============================] - 53s 2s/step - loss: 0.6361 - binary_accuracy: 0.6654 - val_loss: 0.6505 - val_binary_accuracy: 0.5402

Epoch 00027: val_loss did not improve from 0.64379
Epoch 28/60
34/34 [==============================] - 53s 2s/step - loss: 0.6321 - binary_accuracy: 0.6673 - val_loss: 0.6426 - val_binary_accuracy: 0.5402

Epoch 00028: val_loss improved from 0.64379 to 0.64256, saving model to model_6.best.hdf5
Epoch 29/60
34/34 [==============================] - 54s 2s/step - loss: 0.6335 - binary_accuracy: 0.6523 - val_loss: 0.6416 - val_binary_accuracy: 0.5805

Epoch 00029: val_loss improved from 0.64256 to 0.64162, saving model to model_6.best.hdf5
Epoch 30/60
34/34 [==============================] - 54s 2s/step - loss: 0.6265 - binary_accuracy: 0.6682 - val_loss: 0.6304 - val_binary_accuracy: 0.5977

Epoch 00030: val_loss improved from 0.64162 to 0.63042, saving model to model_6.best.hdf5
Epoch 31/60
34/34 [==============================] - 53s 2s/step - loss: 0.6286 - binary_accuracy: 0.6631 - val_loss: 0.6437 - val_binary_accuracy: 0.5977

Epoch 00031: val_loss did not improve from 0.63042
Epoch 32/60
34/34 [==============================] - 54s 2s/step - loss: 0.6309 - binary_accuracy: 0.6603 - val_loss: 0.6362 - val_binary_accuracy: 0.5805

Epoch 00032: val_loss did not improve from 0.63042
Epoch 33/60
34/34 [==============================] - 53s 2s/step - loss: 0.6274 - binary_accuracy: 0.6612 - val_loss: 0.6569 - val_binary_accuracy: 0.6149

Epoch 00033: val_loss did not improve from 0.63042
Epoch 34/60
34/34 [==============================] - 54s 2s/step - loss: 0.6209 - binary_accuracy: 0.6790 - val_loss: 0.6635 - val_binary_accuracy: 0.6092

Epoch 00034: val_loss did not improve from 0.63042
Epoch 35/60
34/34 [==============================] - 54s 2s/step - loss: 0.6256 - binary_accuracy: 0.6631 - val_loss: 0.6587 - val_binary_accuracy: 0.6322

Epoch 00035: val_loss did not improve from 0.63042
Epoch 36/60
34/34 [==============================] - 54s 2s/step - loss: 0.6159 - binary_accuracy: 0.6879 - val_loss: 0.6353 - val_binary_accuracy: 0.6609

Epoch 00036: val_loss did not improve from 0.63042
Epoch 37/60
34/34 [==============================] - 54s 2s/step - loss: 0.6207 - binary_accuracy: 0.6706 - val_loss: 0.6737 - val_binary_accuracy: 0.5920

Epoch 00037: val_loss did not improve from 0.63042
Epoch 38/60
34/34 [==============================] - 55s 2s/step - loss: 0.6215 - binary_accuracy: 0.6701 - val_loss: 0.6552 - val_binary_accuracy: 0.6207

Epoch 00038: val_loss did not improve from 0.63042
Epoch 39/60
34/34 [==============================] - 54s 2s/step - loss: 0.6179 - binary_accuracy: 0.6766 - val_loss: 0.6287 - val_binary_accuracy: 0.6437

Epoch 00039: val_loss improved from 0.63042 to 0.62870, saving model to model_6.best.hdf5
Epoch 40/60
34/34 [==============================] - 54s 2s/step - loss: 0.6231 - binary_accuracy: 0.6668 - val_loss: 0.7301 - val_binary_accuracy: 0.5402

Epoch 00040: val_loss did not improve from 0.62870
Epoch 41/60
34/34 [==============================] - 54s 2s/step - loss: 0.6248 - binary_accuracy: 0.6701 - val_loss: 0.6598 - val_binary_accuracy: 0.5977

Epoch 00041: val_loss did not improve from 0.62870
Epoch 42/60
34/34 [==============================] - 54s 2s/step - loss: 0.6213 - binary_accuracy: 0.6687 - val_loss: 0.6879 - val_binary_accuracy: 0.5632

Epoch 00042: val_loss did not improve from 0.62870
Epoch 43/60
34/34 [==============================] - 54s 2s/step - loss: 0.6216 - binary_accuracy: 0.6738 - val_loss: 0.6627 - val_binary_accuracy: 0.5977

Epoch 00043: val_loss did not improve from 0.62870
Epoch 44/60
34/34 [==============================] - 54s 2s/step - loss: 0.6112 - binary_accuracy: 0.6822 - val_loss: 0.7151 - val_binary_accuracy: 0.5575

Epoch 00044: val_loss did not improve from 0.62870
Epoch 45/60
34/34 [==============================] - 54s 2s/step - loss: 0.6180 - binary_accuracy: 0.6748 - val_loss: 0.7306 - val_binary_accuracy: 0.5402

Epoch 00045: val_loss did not improve from 0.62870
Epoch 46/60
34/34 [==============================] - 54s 2s/step - loss: 0.6082 - binary_accuracy: 0.6944 - val_loss: 0.6466 - val_binary_accuracy: 0.6437

Epoch 00046: val_loss did not improve from 0.62870
Epoch 47/60
34/34 [==============================] - 54s 2s/step - loss: 0.6150 - binary_accuracy: 0.6832 - val_loss: 0.6695 - val_binary_accuracy: 0.5920

Epoch 00047: val_loss did not improve from 0.62870
Epoch 48/60
34/34 [==============================] - 54s 2s/step - loss: 0.6170 - binary_accuracy: 0.6734 - val_loss: 0.6982 - val_binary_accuracy: 0.5632

Epoch 00048: val_loss did not improve from 0.62870
Epoch 49/60
34/34 [==============================] - 53s 2s/step - loss: 0.6234 - binary_accuracy: 0.6607 - val_loss: 0.6405 - val_binary_accuracy: 0.6379

Epoch 00049: val_loss did not improve from 0.62870
Epoch 50/60
34/34 [==============================] - 54s 2s/step - loss: 0.6099 - binary_accuracy: 0.6846 - val_loss: 0.6883 - val_binary_accuracy: 0.5862

Epoch 00050: val_loss did not improve from 0.62870
Epoch 51/60
34/34 [==============================] - 54s 2s/step - loss: 0.6160 - binary_accuracy: 0.6743 - val_loss: 0.7362 - val_binary_accuracy: 0.5517

Epoch 00051: val_loss did not improve from 0.62870
Epoch 52/60
34/34 [==============================] - 54s 2s/step - loss: 0.6126 - binary_accuracy: 0.6832 - val_loss: 0.6823 - val_binary_accuracy: 0.5920

Epoch 00052: val_loss did not improve from 0.62870
Epoch 53/60
34/34 [==============================] - 54s 2s/step - loss: 0.6169 - binary_accuracy: 0.6668 - val_loss: 0.7326 - val_binary_accuracy: 0.5575

Epoch 00053: val_loss did not improve from 0.62870
Epoch 54/60
34/34 [==============================] - 54s 2s/step - loss: 0.6103 - binary_accuracy: 0.6860 - val_loss: 0.7190 - val_binary_accuracy: 0.5575

Epoch 00054: val_loss did not improve from 0.62870
In [17]:
pred_Y_6, ground_truth_6, evaluation_dic_6 = predict_and_evaluate_model(model_6, "model_6", test_gen, steps=len(test_df)/64)
22/21 [==============================] - 22s 998ms/step
Model prediction min: 0.301
Model prediction max: 0.739
  • The validation loss (accuracy) did not improve during the training..
  • The prediction distribution exhibits single peak for the pneumonia positive class.

Model 7

In [20]:
model_7_pretrained = load_pretrained_model(pretrained_model=DenseNet121, transfer_layer='conv5_block1_1_conv', 
                          transfer_layer_idx=428)
model_7 = build_model(model_7_pretrained, dropout=0.2)

save_model(model_7, 'model_7')
Pre-trained model layers and their trainability
input_3 False
zero_padding2d_5 False
conv1/conv False
conv1/bn False
conv1/relu False
zero_padding2d_6 False
pool1 False
conv2_block1_0_bn False
conv2_block1_0_relu False
conv2_block1_1_conv False
conv2_block1_1_bn False
conv2_block1_1_relu False
conv2_block1_2_conv False
conv2_block1_concat False
conv2_block2_0_bn False
conv2_block2_0_relu False
conv2_block2_1_conv False
conv2_block2_1_bn False
conv2_block2_1_relu False
conv2_block2_2_conv False
conv2_block2_concat False
conv2_block3_0_bn False
conv2_block3_0_relu False
conv2_block3_1_conv False
conv2_block3_1_bn False
conv2_block3_1_relu False
conv2_block3_2_conv False
conv2_block3_concat False
conv2_block4_0_bn False
conv2_block4_0_relu False
conv2_block4_1_conv False
conv2_block4_1_bn False
conv2_block4_1_relu False
conv2_block4_2_conv False
conv2_block4_concat False
conv2_block5_0_bn False
conv2_block5_0_relu False
conv2_block5_1_conv False
conv2_block5_1_bn False
conv2_block5_1_relu False
conv2_block5_2_conv False
conv2_block5_concat False
conv2_block6_0_bn False
conv2_block6_0_relu False
conv2_block6_1_conv False
conv2_block6_1_bn False
conv2_block6_1_relu False
conv2_block6_2_conv False
conv2_block6_concat False
pool2_bn False
pool2_relu False
pool2_conv False
pool2_pool False
conv3_block1_0_bn False
conv3_block1_0_relu False
conv3_block1_1_conv False
conv3_block1_1_bn False
conv3_block1_1_relu False
conv3_block1_2_conv False
conv3_block1_concat False
conv3_block2_0_bn False
conv3_block2_0_relu False
conv3_block2_1_conv False
conv3_block2_1_bn False
conv3_block2_1_relu False
conv3_block2_2_conv False
conv3_block2_concat False
conv3_block3_0_bn False
conv3_block3_0_relu False
conv3_block3_1_conv False
conv3_block3_1_bn False
conv3_block3_1_relu False
conv3_block3_2_conv False
conv3_block3_concat False
conv3_block4_0_bn False
conv3_block4_0_relu False
conv3_block4_1_conv False
conv3_block4_1_bn False
conv3_block4_1_relu False
conv3_block4_2_conv False
conv3_block4_concat False
conv3_block5_0_bn False
conv3_block5_0_relu False
conv3_block5_1_conv False
conv3_block5_1_bn False
conv3_block5_1_relu False
conv3_block5_2_conv False
conv3_block5_concat False
conv3_block6_0_bn False
conv3_block6_0_relu False
conv3_block6_1_conv False
conv3_block6_1_bn False
conv3_block6_1_relu False
conv3_block6_2_conv False
conv3_block6_concat False
conv3_block7_0_bn False
conv3_block7_0_relu False
conv3_block7_1_conv False
conv3_block7_1_bn False
conv3_block7_1_relu False
conv3_block7_2_conv False
conv3_block7_concat False
conv3_block8_0_bn False
conv3_block8_0_relu False
conv3_block8_1_conv False
conv3_block8_1_bn False
conv3_block8_1_relu False
conv3_block8_2_conv False
conv3_block8_concat False
conv3_block9_0_bn False
conv3_block9_0_relu False
conv3_block9_1_conv False
conv3_block9_1_bn False
conv3_block9_1_relu False
conv3_block9_2_conv False
conv3_block9_concat False
conv3_block10_0_bn False
conv3_block10_0_relu False
conv3_block10_1_conv False
conv3_block10_1_bn False
conv3_block10_1_relu False
conv3_block10_2_conv False
conv3_block10_concat False
conv3_block11_0_bn False
conv3_block11_0_relu False
conv3_block11_1_conv False
conv3_block11_1_bn False
conv3_block11_1_relu False
conv3_block11_2_conv False
conv3_block11_concat False
conv3_block12_0_bn False
conv3_block12_0_relu False
conv3_block12_1_conv False
conv3_block12_1_bn False
conv3_block12_1_relu False
conv3_block12_2_conv False
conv3_block12_concat False
pool3_bn False
pool3_relu False
pool3_conv False
pool3_pool False
conv4_block1_0_bn False
conv4_block1_0_relu False
conv4_block1_1_conv False
conv4_block1_1_bn False
conv4_block1_1_relu False
conv4_block1_2_conv False
conv4_block1_concat False
conv4_block2_0_bn False
conv4_block2_0_relu False
conv4_block2_1_conv False
conv4_block2_1_bn False
conv4_block2_1_relu False
conv4_block2_2_conv False
conv4_block2_concat False
conv4_block3_0_bn False
conv4_block3_0_relu False
conv4_block3_1_conv False
conv4_block3_1_bn False
conv4_block3_1_relu False
conv4_block3_2_conv False
conv4_block3_concat False
conv4_block4_0_bn False
conv4_block4_0_relu False
conv4_block4_1_conv False
conv4_block4_1_bn False
conv4_block4_1_relu False
conv4_block4_2_conv False
conv4_block4_concat False
conv4_block5_0_bn False
conv4_block5_0_relu False
conv4_block5_1_conv False
conv4_block5_1_bn False
conv4_block5_1_relu False
conv4_block5_2_conv False
conv4_block5_concat False
conv4_block6_0_bn False
conv4_block6_0_relu False
conv4_block6_1_conv False
conv4_block6_1_bn False
conv4_block6_1_relu False
conv4_block6_2_conv False
conv4_block6_concat False
conv4_block7_0_bn False
conv4_block7_0_relu False
conv4_block7_1_conv False
conv4_block7_1_bn False
conv4_block7_1_relu False
conv4_block7_2_conv False
conv4_block7_concat False
conv4_block8_0_bn False
conv4_block8_0_relu False
conv4_block8_1_conv False
conv4_block8_1_bn False
conv4_block8_1_relu False
conv4_block8_2_conv False
conv4_block8_concat False
conv4_block9_0_bn False
conv4_block9_0_relu False
conv4_block9_1_conv False
conv4_block9_1_bn False
conv4_block9_1_relu False
conv4_block9_2_conv False
conv4_block9_concat False
conv4_block10_0_bn False
conv4_block10_0_relu False
conv4_block10_1_conv False
conv4_block10_1_bn False
conv4_block10_1_relu False
conv4_block10_2_conv False
conv4_block10_concat False
conv4_block11_0_bn False
conv4_block11_0_relu False
conv4_block11_1_conv False
conv4_block11_1_bn False
conv4_block11_1_relu False
conv4_block11_2_conv False
conv4_block11_concat False
conv4_block12_0_bn False
conv4_block12_0_relu False
conv4_block12_1_conv False
conv4_block12_1_bn False
conv4_block12_1_relu False
conv4_block12_2_conv False
conv4_block12_concat False
conv4_block13_0_bn False
conv4_block13_0_relu False
conv4_block13_1_conv False
conv4_block13_1_bn False
conv4_block13_1_relu False
conv4_block13_2_conv False
conv4_block13_concat False
conv4_block14_0_bn False
conv4_block14_0_relu False
conv4_block14_1_conv False
conv4_block14_1_bn False
conv4_block14_1_relu False
conv4_block14_2_conv False
conv4_block14_concat False
conv4_block15_0_bn False
conv4_block15_0_relu False
conv4_block15_1_conv False
conv4_block15_1_bn False
conv4_block15_1_relu False
conv4_block15_2_conv False
conv4_block15_concat False
conv4_block16_0_bn False
conv4_block16_0_relu False
conv4_block16_1_conv False
conv4_block16_1_bn False
conv4_block16_1_relu False
conv4_block16_2_conv False
conv4_block16_concat False
conv4_block17_0_bn False
conv4_block17_0_relu False
conv4_block17_1_conv False
conv4_block17_1_bn False
conv4_block17_1_relu False
conv4_block17_2_conv False
conv4_block17_concat False
conv4_block18_0_bn False
conv4_block18_0_relu False
conv4_block18_1_conv False
conv4_block18_1_bn False
conv4_block18_1_relu False
conv4_block18_2_conv False
conv4_block18_concat False
conv4_block19_0_bn False
conv4_block19_0_relu False
conv4_block19_1_conv False
conv4_block19_1_bn False
conv4_block19_1_relu False
conv4_block19_2_conv False
conv4_block19_concat False
conv4_block20_0_bn False
conv4_block20_0_relu False
conv4_block20_1_conv False
conv4_block20_1_bn False
conv4_block20_1_relu False
conv4_block20_2_conv False
conv4_block20_concat False
conv4_block21_0_bn False
conv4_block21_0_relu False
conv4_block21_1_conv False
conv4_block21_1_bn False
conv4_block21_1_relu False
conv4_block21_2_conv False
conv4_block21_concat False
conv4_block22_0_bn False
conv4_block22_0_relu False
conv4_block22_1_conv False
conv4_block22_1_bn False
conv4_block22_1_relu False
conv4_block22_2_conv False
conv4_block22_concat False
conv4_block23_0_bn False
conv4_block23_0_relu False
conv4_block23_1_conv False
conv4_block23_1_bn False
conv4_block23_1_relu False
conv4_block23_2_conv False
conv4_block23_concat False
conv4_block24_0_bn False
conv4_block24_0_relu False
conv4_block24_1_conv False
conv4_block24_1_bn False
conv4_block24_1_relu False
conv4_block24_2_conv False
conv4_block24_concat False
pool4_bn False
pool4_relu False
pool4_conv False
pool4_pool False
conv5_block1_0_bn False
conv5_block1_0_relu False
conv5_block1_1_conv False
conv5_block1_1_bn False
conv5_block1_1_relu False
conv5_block1_2_conv False
conv5_block1_concat False
conv5_block2_0_bn False
conv5_block2_0_relu False
conv5_block2_1_conv False
conv5_block2_1_bn False
conv5_block2_1_relu False
conv5_block2_2_conv False
conv5_block2_concat False
conv5_block3_0_bn False
conv5_block3_0_relu False
conv5_block3_1_conv False
conv5_block3_1_bn False
conv5_block3_1_relu False
conv5_block3_2_conv False
conv5_block3_concat False
conv5_block4_0_bn False
conv5_block4_0_relu False
conv5_block4_1_conv False
conv5_block4_1_bn False
conv5_block4_1_relu False
conv5_block4_2_conv False
conv5_block4_concat False
conv5_block5_0_bn False
conv5_block5_0_relu False
conv5_block5_1_conv False
conv5_block5_1_bn False
conv5_block5_1_relu False
conv5_block5_2_conv False
conv5_block5_concat False
conv5_block6_0_bn False
conv5_block6_0_relu False
conv5_block6_1_conv False
conv5_block6_1_bn False
conv5_block6_1_relu False
conv5_block6_2_conv False
conv5_block6_concat False
conv5_block7_0_bn False
conv5_block7_0_relu False
conv5_block7_1_conv False
conv5_block7_1_bn False
conv5_block7_1_relu False
conv5_block7_2_conv False
conv5_block7_concat False
conv5_block8_0_bn False
conv5_block8_0_relu False
conv5_block8_1_conv False
conv5_block8_1_bn False
conv5_block8_1_relu False
conv5_block8_2_conv False
conv5_block8_concat False
conv5_block9_0_bn False
conv5_block9_0_relu False
conv5_block9_1_conv False
conv5_block9_1_bn False
conv5_block9_1_relu False
conv5_block9_2_conv False
conv5_block9_concat False
conv5_block10_0_bn False
conv5_block10_0_relu False
conv5_block10_1_conv False
conv5_block10_1_bn False
conv5_block10_1_relu False
conv5_block10_2_conv False
conv5_block10_concat False
conv5_block11_0_bn False
conv5_block11_0_relu False
conv5_block11_1_conv False
conv5_block11_1_bn False
conv5_block11_1_relu False
conv5_block11_2_conv False
conv5_block11_concat False
conv5_block12_0_bn False
conv5_block12_0_relu False
conv5_block12_1_conv False
conv5_block12_1_bn False
conv5_block12_1_relu False
conv5_block12_2_conv False
conv5_block12_concat False
conv5_block13_0_bn False
conv5_block13_0_relu False
conv5_block13_1_conv False
conv5_block13_1_bn False
conv5_block13_1_relu False
conv5_block13_2_conv False
conv5_block13_concat False
conv5_block14_0_bn False
conv5_block14_0_relu False
conv5_block14_1_conv False
conv5_block14_1_bn False
conv5_block14_1_relu False
conv5_block14_2_conv False
conv5_block14_concat False
conv5_block15_0_bn False
conv5_block15_0_relu False
conv5_block15_1_conv False
conv5_block15_1_bn False
conv5_block15_1_relu False
conv5_block15_2_conv False
conv5_block15_concat False
conv5_block16_0_bn False
conv5_block16_0_relu False
conv5_block16_1_conv False
conv5_block16_1_bn False
conv5_block16_1_relu False
conv5_block16_2_conv False
conv5_block16_concat False
bn False
relu False
avg_pool False
fc1000 True
Saved model architecture under model_7.json
In [21]:
model_7_hist = train_model(model_7, 'model_7', train_gen, valid_gen, epochs=100)
Epoch 1/100
34/34 [==============================] - 60s 2s/step - loss: 0.6931 - binary_accuracy: 0.4986 - val_loss: 0.6945 - val_binary_accuracy: 0.5057

Epoch 00001: val_loss improved from inf to 0.69446, saving model to model_7.best.hdf5
Epoch 2/100
34/34 [==============================] - 51s 1s/step - loss: 0.6928 - binary_accuracy: 0.5075 - val_loss: 0.6947 - val_binary_accuracy: 0.5057

Epoch 00002: val_loss did not improve from 0.69446
Epoch 3/100
34/34 [==============================] - 55s 2s/step - loss: 0.6925 - binary_accuracy: 0.5397 - val_loss: 0.6951 - val_binary_accuracy: 0.5000

Epoch 00003: val_loss did not improve from 0.69446
Epoch 4/100
34/34 [==============================] - 54s 2s/step - loss: 0.6915 - binary_accuracy: 0.5528 - val_loss: 0.6959 - val_binary_accuracy: 0.5402

Epoch 00004: val_loss did not improve from 0.69446
Epoch 5/100
34/34 [==============================] - 55s 2s/step - loss: 0.6893 - binary_accuracy: 0.5836 - val_loss: 0.6988 - val_binary_accuracy: 0.5000

Epoch 00005: val_loss did not improve from 0.69446
Epoch 6/100
34/34 [==============================] - 54s 2s/step - loss: 0.6836 - binary_accuracy: 0.5888 - val_loss: 0.7073 - val_binary_accuracy: 0.5000

Epoch 00006: val_loss did not improve from 0.69446
Epoch 7/100
34/34 [==============================] - 54s 2s/step - loss: 0.6764 - binary_accuracy: 0.6028 - val_loss: 0.7192 - val_binary_accuracy: 0.5172

Epoch 00007: val_loss did not improve from 0.69446
Epoch 8/100
34/34 [==============================] - 55s 2s/step - loss: 0.6752 - binary_accuracy: 0.5879 - val_loss: 0.7247 - val_binary_accuracy: 0.5115

Epoch 00008: val_loss did not improve from 0.69446
Epoch 9/100
34/34 [==============================] - 55s 2s/step - loss: 0.6661 - binary_accuracy: 0.6065 - val_loss: 0.7337 - val_binary_accuracy: 0.5460

Epoch 00009: val_loss did not improve from 0.69446
Epoch 10/100
34/34 [==============================] - 54s 2s/step - loss: 0.6630 - binary_accuracy: 0.6178 - val_loss: 0.7385 - val_binary_accuracy: 0.5575

Epoch 00010: val_loss did not improve from 0.69446
Epoch 11/100
34/34 [==============================] - 54s 2s/step - loss: 0.6574 - binary_accuracy: 0.6262 - val_loss: 0.7465 - val_binary_accuracy: 0.5287

Epoch 00011: val_loss did not improve from 0.69446
Epoch 12/100
34/34 [==============================] - 54s 2s/step - loss: 0.6564 - binary_accuracy: 0.6136 - val_loss: 0.7462 - val_binary_accuracy: 0.5345

Epoch 00012: val_loss did not improve from 0.69446
Epoch 13/100
34/34 [==============================] - 54s 2s/step - loss: 0.6504 - binary_accuracy: 0.6262 - val_loss: 0.7504 - val_binary_accuracy: 0.5747

Epoch 00013: val_loss did not improve from 0.69446
Epoch 14/100
34/34 [==============================] - 54s 2s/step - loss: 0.6523 - binary_accuracy: 0.6257 - val_loss: 0.7453 - val_binary_accuracy: 0.5747

Epoch 00014: val_loss did not improve from 0.69446
Epoch 15/100
34/34 [==============================] - 54s 2s/step - loss: 0.6563 - binary_accuracy: 0.6224 - val_loss: 0.7423 - val_binary_accuracy: 0.5460

Epoch 00015: val_loss did not improve from 0.69446
Epoch 16/100
34/34 [==============================] - 54s 2s/step - loss: 0.6452 - binary_accuracy: 0.6425 - val_loss: 0.7440 - val_binary_accuracy: 0.5690

Epoch 00016: val_loss did not improve from 0.69446
In [ ]:
pred_Y_7, ground_truth_7, evaluation_dic_7 = predict_and_evaluate_model(model_7, "model_7", test_gen, steps=len(test_df)/64)

Model performance summary

In [2]:
data = {'Model': ['Model_1', 'Model_2', 'Model_3', 'Model_4', 'Model_5', 'Model_6', 'Model_7'],
       'Transfer layer': ['conv5_block16_1_conv', 'conv5_block16_1_conv', 'conv5_block16_1_conv', 
                          'avg_pool', 'avg_pool', 'avg_pool', 'avg_pool'],
       'Transfer layer idx': [420, 420, 420, 427, 427, 427, 427],
       'Embedding model': ['standard', 'standard', 'standard', 'standard', 'simpler', 'simpler', 'standard'],
       'Starting learning rate' : ['E-4', 'E-4', 'E-3', 'E-4', 'E-4', 'E-3', 'E-4'],
       'Dropout': [0.2, 0.5, 0.2, 0.2, 0.0, 0.0],
       'Best Epoch /Epochs set)' : ['5/100', '5/100', '14/100', '27/50', '100/100', '39/60', '.'],
       'Prediction distribution': ['0.08 - 0.93', '0.16 - 0.87', '0.37 - 0.63', '0.07 - 0.93', '0.42 - 0.59', '0.30 - 0.74',
                                  '.'],
       'Prediction distribution: peaks': ['Two (wrong ratio)', 'Single', 'Single', 
                                          'Two (ratio closer to 80/20)', 'Single', 'Single','.'],
       'AUC' : [0.61, 0.56, 0.50, 0.57, 0.64, 0.65, 0],
       'AP Score' : [0.28, 0.23, 0.20, 0.24, 0.31, 0.29,0],
       'Comment' : ['Highly overfitting  after first epochs', 'Highly overfitting', 'Underfitting', 
                    'Overfitting', 'Slow learning', 'No learning', '.']}
models_summary = pd.DataFrame(data=data)
models_summary
Out[2]:
Model Transfer layer Transfer layer idx Embedding model Starting learning rate Dropout Best Epoch /Epochs set) Prediction distribution Prediction distribution: peaks AUC AP Score Comment
0 Model_1 conv5_block16_1_conv 420 standard E-4 0.2 5/100 0.08 - 0.93 Two (wrong ratio) 0.61 0.28 Highly overfitting after first epochs
1 Model_2 conv5_block16_1_conv 420 standard E-4 0.5 5/100 0.16 - 0.87 Single 0.56 0.23 Highly overfitting
2 Model_3 conv5_block16_1_conv 420 standard E-3 0.2 14/100 0.37 - 0.63 Single 0.50 0.20 Underfitting
3 Model_4 avg_pool 427 standard E-4 0.2 27/50 0.07 - 0.93 Two (ratio closer to 80/20) 0.57 0.24 Overfitting
4 Model_5 avg_pool 427 simpler E-4 0.0 100/100 0.42 - 0.59 Single 0.64 0.31 Slow learning
5 Model_6 avg_pool 427 simpler E-3 0.0 39/60 0.30 - 0.74 Single 0.65 0.29 No learning

Evaluation of chosen model

  • Model 4 was chosen for further evaluation
In [3]:
# Loading the numpy arrays contaning ground truth and predictions
pred_Y_4 = np.load('model_4_pred_Y.npy')
pred_Y_GT = np.load('model_4_GT.numpy.npy')
In [4]:
# Calculating of false positive rate, true positive rate, precision, recall for different treshold values
fpr_4, tpr_4, thresholds_auroc_4 = roc_curve(pred_Y_GT, pred_Y_4)
precision_4, recall_4, thresholds_pr_4 = precision_recall_curve(pred_Y_GT, pred_Y_4)

# Calculating treshold for different F1 score values
f1_4, thresholds_pr_4 = plot_f1_tresh(pred_Y_GT, pred_Y_4)
/opt/conda/lib/python3.7/site-packages/ipykernel_launcher.py:114: RuntimeWarning: invalid value encountered in true_divide
In [5]:
# Checking values at the end of the f1 score numpy array
f1_4[-10:]
Out[5]:
array([0.02120141, 0.0141844 , 0.00711744, 0.00714286, 0.00716846,
       0.00719424, 0.00722022,        nan,        nan, 0.        ])
In [6]:
# Chosing treshold maximizing F1 score
f1_4_max = np.max(f1_4[:-3]) #omitting NaN values 
f1_4_max
Out[6]:
0.3530377668308703
In [31]:
# Extracting precision and recall for the chosen treshold
idx = np.where(f1_4[:-3] == f1_4[:-3].max())
tresh_4_best = thresholds_pr_4[idx]
print(f'Treshold maximizing f1 score: {tresh_4_best[0]:.3}')
print(f'For this treshold precision equals to: {precision_4[idx][0]:.3}')
print(f'For this treshold recall equals to: {recall_4[idx][0]:.3}')
Treshold maximizing f1 score: 0.171
For this treshold precision equals to: 0.228
For this treshold recall equals to: 0.785
In [17]:
# Classification of the scan on the basis of chosen treshold
evaluation_df_4 = make_evaluation_df(pred_Y_GT, pred_Y_4, tresh_4_best[0])
Ground_truth Pred Pred_tresh
0 0 0.737593 1.0
1 0 0.194913 1.0
2 0 0.290274 1.0
3 1 0.126209 0.0
4 1 0.407251 1.0
In [20]:
# Confusion matrix
confusion_matrix_4 = confusion_matrix(evaluation_df_4['Ground_truth'], evaluation_df_4['Pred_tresh'])
confusion_matrix_4
Out[20]:
array([[367, 729],
       [ 59, 215]])
  • The model has very high number of false positives
In [32]:
# Calculating specificity
specificity_4 = confusion_matrix_4[0,0] / (confusion_matrix_4[0,0] + confusion_matrix_4[0,1])
print(f'For this treshold specificity equals to: {specificity_4:.3}')
For this treshold specificity equals to: 0.335
In [42]:
# Displaying images from the test set with first number corresponding to the ground truth 
# and the second to the model prediction
test_x, test_y = next(test_gen)

fig, m_axs = plt.subplots(8, 8, figsize = (20, 20))
i = 0

for (image, label, plot) in zip(test_x, test_y, m_axs.flatten()):
    plot.imshow(image[:,:,0], cmap = 'bone')
    if label == 1: 
        if pred_Y_4[i] >= tresh_4_best[0]:
            plot.set_title('1, 1')
        else:
            plot.set_title('1, 0')
    else:
        if pred_Y_4[i] >= tresh_4_best[0]:
            plot.set_title('0, 1')
        else:
            plot.set_title('0, 0')
    plot.axis('off')
    i += 1

Project summary

The chosen model 4 has maximal F1 score amounting to 0.353, which is comparable to Radiologist 2 from Ref [1], who had f1 score 0.356. However, the model has too many false positives, thus it should be further tuned to achieve better performance. Further hyperparameter tuning of model 4 could be performed, with following steps:

  • higher dropout, i.e. 0.5
  • freezing avg_pooling layer and retraining the last dense layer together with layers attached after the pre-trained model.
  • introducing regularization for the layers attached after the pre-trained model.
  • changing batch size

[1] Rajpurkar, Pranav, Jeremy Irvin, Kaylie Zhu, Brandon Yang, Hershel Mehta, Tony Duan, Daisy Ding, et al. “CheXNet: Radiologist-Level Pneumonia Detection on Chest X-Rays with Deep Learning.” ArXiv:1711.05225 [Cs, Stat], December 25, 2017